909 results for measurement and metrology


Relevance: 80.00%

Abstract:

Background. Exercise therapy improves functional capacity in CHF, but selection and individualization of training would be helped by a simple non-invasive marker of peak VO2. Peak VO2 in these patients is difficult to predict without direct measurement, and LV ejection fraction is a poor predictor. Myocardial tissue velocities are less load-dependent and may be predictive of the exercise response in CHF patients. We sought to use tissue velocity as a predictor of peak VO2 in CHF patients. Methods. Resting 2D-echocardiography and tissue Doppler imaging were performed in 182 CHF patients (159 male, age 62±10 years) before and after metabolic exercise testing. The majority of these patients (129, 71%) had an ischemic cardiomyopathy, with a resting EF of 35±13% and a peak VO2 of 13.5±4.7 ml/kg/min. Results. Neither resting EF (r=0.15) nor peak EF (r=0.18, both p=NS) was correlated with peak VO2. However, peak VO2 correlated with peak systolic velocity in the septal (Vss, r=0.31) and lateral walls (Vsl, r=0.26, both p=0.01). In a general linear model (r2=0.25), peak VO2 was calculated from the following equation: 9.6 + 0.68*Vss - 0.09*age + 0.06*maximum HR. This model proved a better predictor of peak VO2 (r=0.51, p=0.01) than the standard Wasserman prediction equations (r=-0.12, p=0.01). Conclusions. Resting tissue Doppler, age and maximum heart rate may be used to predict functional capacity in CHF patients. This may be of use in selecting and following the response to therapy, including exercise training.
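The reported linear model can be written directly as a function. A minimal sketch; the units of Vss (assumed cm/s here) are not stated in the abstract:

```python
def predict_peak_vo2(vss, age, max_hr):
    """Peak VO2 (ml/kg/min) from the reported general linear model.

    vss: peak systolic septal tissue velocity (assumed cm/s),
    age in years, max_hr in beats/min.
    """
    return 9.6 + 0.68 * vss - 0.09 * age + 0.06 * max_hr
```

For example, a 62-year-old patient with Vss = 6 cm/s and a maximum heart rate of 120 bpm gives a predicted peak VO2 of about 15.3 ml/kg/min.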

Relevance: 80.00%

Abstract:

Arterial distensibility has been shown to be important in the pathogenesis of cardiovascular abnormalities such as hypertension. It is also known that arterial pulse wave velocity (PWV) is a measure of the elasticity, or stiffness, of peripheral arterial blood vessels. However, measuring it accurately generally requires complex instrumentation that is not suited to continuous monitoring. This paper describes a simple and non-intrusive method to detect the cardiovascular pulse from a human wrist above the radial artery and from a fingertip. The main components of the proposed method are a piezoelectric transducer and photo-plethysmography circuitry. Five healthy adults (4 male), aged 25 to 38 years, were recruited. The timing consistency of the detected pulsations was first evaluated against that obtained from a commercial electrocardiogram. The derived PWV was then assessed against values predicted by the regression equations of two previous, similar studies. The results show good correlations (p < 0.05) and good agreement, respectively. The simplicity and non-invasive nature of the proposed method make it attractive even for young or easily distressed patients, and it can be used for prolonged monitoring without discomfort.
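PWV itself is simply the distance between the two measurement sites divided by the pulse transit time between them. A minimal sketch; the path length is an assumed input, not a value from the study:

```python
def pulse_wave_velocity(path_length_m, transit_time_s):
    """Pulse wave velocity (m/s): distance between sites / pulse transit time."""
    if transit_time_s <= 0:
        raise ValueError("transit time must be positive")
    return path_length_m / transit_time_s
```

A 0.6 m wrist-to-fingertip path traversed in 80 ms would give 7.5 m/s, within the range typically reported for peripheral arteries.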

Relevance: 80.00%

Abstract:

Purpose – The purpose of the paper is to develop an integrated framework for performance management of healthcare services. Design/methodology/approach – This study develops a performance management framework for healthcare services using a combined analytic hierarchy process (AHP) and logical framework (LOGFRAME) approach. The framework is then applied to the intensive care units of three different hospitals in developing nations. Numerous focus group discussions were undertaken, involving experts from the specific area under investigation. Findings – The study reveals that a combination of outcome-, structure- and process-based critical success factors, within a combined AHP and LOGFRAME-based performance management framework, helps manage the performance of healthcare services. Practical implications – The proposed framework could be applied in hospital-based healthcare services. Originality/value – Conventional approaches to healthcare performance management are either outcome-based or process-based, and cannot by themselves reveal appropriate improvement measures to assure superior performance. Additionally, they lack a means of planning, implementing and evaluating the improvement projects identified through performance measurement. This study presents an integrated approach to performance measurement and a framework for implementing improvement projects.
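The AHP step of such a framework reduces a pairwise-comparison matrix of criteria to priority weights; a common approximation uses row geometric means. A sketch with an illustrative matrix (not values from the study):

```python
import math

def ahp_weights(pairwise):
    """Priority weights from a reciprocal pairwise-comparison matrix,
    using the row geometric-mean approximation."""
    n = len(pairwise)
    gm = [math.prod(row) ** (1.0 / n) for row in pairwise]
    total = sum(gm)
    return [g / total for g in gm]

# Illustrative 3-criterion comparison (e.g. outcome vs structure vs process)
weights = ahp_weights([
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 3.0],
    [1 / 5, 1 / 3, 1.0],
])
```

The weights sum to 1 and preserve the dominance order expressed by the comparisons (here the first criterion receives the largest weight).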

Relevance: 80.00%

Abstract:

Purpose - This paper provides a deeper examination of the fundamentals of commonly used techniques - such as coefficient alpha and factor analysis - in order to more strongly link the techniques used by marketing and social researchers to their underlying psychometric and statistical rationale. Design/methodology/approach - A wide-ranging review and synthesis of psychometric and other measurement literature, both within and outside the marketing field, is used to illuminate and reconsider a number of misconceptions which seem to have evolved in marketing research. Findings - The research finds that marketing scholars have generally concentrated on reporting what are essentially arbitrary figures, such as coefficient alpha, without fully understanding what these figures imply. It is argued that, if the link between theory and technique is not clearly understood, the use of psychometric measure development tools actually runs the risk of detracting from the validity of the measures rather than enhancing it. Research limitations/implications - The focus on one stage of a particular form of measure development could be seen as rather specialised. The paper also runs the risk of increasing the amount of dogma surrounding measurement, which runs contrary to its own spirit. Practical implications - This paper shows that researchers may need to spend more time interpreting measurement results. Rather than simply referring to precedent, one needs to understand the link between measurement theory and actual technique. Originality/value - This paper presents psychometric measurement and item analysis theory in an easily understandable format, and offers an important set of conceptual tools for researchers in many fields. © Emerald Group Publishing Limited.
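Coefficient alpha itself is easy to compute from item variances and the variance of the summed scale, which underlines the paper's point that the number alone says little about what a measure means. A plain-Python sketch of the standard formula:

```python
def cronbach_alpha(items):
    """Coefficient (Cronbach's) alpha.

    items: one list of scores per item, all over the same respondents.
    """
    k = len(items)

    def sample_var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_var_sum = sum(sample_var(it) for it in items)
    totals = [sum(scores) for scores in zip(*items)]  # scale total per respondent
    return (k / (k - 1)) * (1 - item_var_sum / sample_var(totals))
```

When all items are perfectly correlated the statistic reaches its ceiling of 1.0, illustrating why a high alpha by itself does not demonstrate a well-constructed measure.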

Relevance: 80.00%

Abstract:

In 2002, we published a paper [Brock, J., Brown, C., Boucher, J., Rippon, G., 2002. The temporal binding deficit hypothesis of autism. Development and Psychopathology 14, 209-224] highlighting the parallels between the psychological model of 'central coherence' in information processing [Frith, U., 1989. Autism: Explaining the Enigma. Blackwell, Oxford] and the neuroscience model of neural integration or 'temporal binding'. We proposed that autism is associated with abnormalities of information integration caused by a reduction in the connectivity between specialised local neural networks in the brain and possible overconnectivity within the isolated individual neural assemblies. The current paper updates this model, providing a summary of theoretical and empirical advances in research implicating disordered connectivity in autism. This is set in the context of changes in the approach to the core psychological deficits in autism, of a greater emphasis on 'interactive specialisation' and the resultant stress on early and/or low-level deficits and their cascading effects on the developing brain [Johnson, M.H., Halit, H., Grice, S.J., Karmiloff-Smith, A., 2002. Neuroimaging of typical and atypical development: a perspective from multiple levels of analysis. Development and Psychopathology 14, 521-536]. We also highlight recent developments in the measurement and modelling of connectivity, particularly the emerging ability to track the temporal dynamics of the brain using electroencephalography (EEG) and magnetoencephalography (MEG) and to investigate the signal characteristics of this activity. This advance could be particularly pertinent in testing an emerging model of effective connectivity based on the balance between excitatory and inhibitory cortical activity [Rubenstein, J.L., Merzenich, M.M., 2003. Model of autism: increased ratio of excitation/inhibition in key neural systems. Genes, Brain and Behavior 2, 255-267; Brown, C., Gruber, T., Rippon, G., Brock, J., Boucher, J., 2005. Gamma abnormalities during perception of illusory figures in autism. Cortex 41, 364-376]. Finally, we note that this convergence of research developments not only enables a greater understanding of autism but also has implications for prevention and remediation. © 2006.

Relevance: 80.00%

Abstract:

A novel wavelength-division-multiplexed in-fibre Bragg grating sensor system combined with high-resolution drift-compensated interferometric wavelength-shift detection is described. This crosstalk-free system is based on the use of an interferometric wavelength scanner and a low-resolution spectrometer. A four-element system is demonstrated for temperature measurement, and a resolution of ±0.1°C has been achieved.
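To put the quoted resolution in context: a silica fibre Bragg grating near 1550 nm typically shifts by roughly 10 pm/°C, so ±0.1°C corresponds to resolving wavelength shifts of about 1 pm. A sketch of the conversion; the 10 pm/°C sensitivity is a typical textbook figure, not a value from this paper:

```python
def temperature_change(delta_lambda_pm, sensitivity_pm_per_c=10.0):
    """Temperature change (degrees C) inferred from a Bragg wavelength shift (pm)."""
    return delta_lambda_pm / sensitivity_pm_per_c
```

A 1 pm resolved shift then maps to the reported 0.1°C temperature resolution.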

Relevance: 80.00%

Abstract:

Quality management is dominated by rational paradigms for the measurement and management of quality, but these paradigms start to "break down" when faced with the inherent complexity of managing quality in intensely competitive, changing environments. In this article, the various theoretical strategy paradigms employed to manage quality are reviewed and the advantages and limitations of each are highlighted. A major implication of this review is that, when faced with complexity, an ideological stance towards any single strategy paradigm for the management of quality is ineffective. A case study is used to demonstrate the need for an integrative multi-paradigm approach to the management of quality as complexity increases.

Relevance: 80.00%

Abstract:

Purpose - The purpose of this study is to develop a performance measurement model for service operations using the analytic hierarchy process approach. Design/methodology/approach - The study reviews the current relevant literature on performance measurement and develops a model for performance measurement. The model is then applied to the intensive care units (ICUs) of three different hospitals in developing nations. Six focus group discussions were undertaken, involving experts from the specific area under investigation, in order to develop an understandable performance measurement model that was both quantitative and hierarchical. Findings - A combination of outcome-, structure- and process-based factors was used as a foundation for the model. Analysis of the links between them revealed the relative importance of each factor and its associated sub-factors. The model was considered an effective quantitative tool by the stakeholders. Research limitations/implications - This research only applies the model to ICUs in healthcare services. Practical implications - Performance measurement is an important area within the operations management field. Although numerous models are routinely deployed both in practice and in research, there is always room for improvement. The present study proposes a hierarchical quantitative approach which considers both subjective and objective performance criteria. Originality/value - This paper develops a hierarchical quantitative model for service performance measurement. It considers success factors with respect to outcomes, structure and processes, with the involvement of the concerned stakeholders, based upon the analytic hierarchy process approach. The model is applied to the ICUs of hospitals in order to demonstrate its effectiveness, providing a comparative international study of service performance measurement in ICUs in three different countries. © Emerald Group Publishing Limited.
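The aggregation step of such a hierarchical model is a weighted sum of weighted sub-factor scores. A sketch with made-up factor names, weights and scores; in the study, the real weights would come from the AHP comparisons with the stakeholders:

```python
# Each top-level factor: (weight, {sub_factor: (sub_weight, score in [0, 1])})
# Factor names and all numbers below are illustrative assumptions.
hierarchy = {
    "outcome":   (0.5, {"mortality": (0.6, 0.8), "length_of_stay": (0.4, 0.7)}),
    "process":   (0.3, {"protocol_adherence": (1.0, 0.9)}),
    "structure": (0.2, {"staffing_ratio": (1.0, 0.6)}),
}

def overall_score(tree):
    """Aggregate sub-factor scores up the two-level hierarchy."""
    return sum(
        w * sum(sw * s for sw, s in subs.values())
        for w, subs in tree.values()
    )
```

With these illustrative numbers the composite performance score is 0.77, and the same structure allows ICUs in different hospitals to be compared on a single scale.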

Relevance: 80.00%

Abstract:

There is an alternative model of the 1-way ANOVA, called the 'random effects' model or 'nested' design, in which the objective is not to test specific effects but to estimate the degree of variation of a particular measurement and to compare different sources of variation that influence the measurement in space and/or time. The most important statistics from a random effects model are the components of variance, which estimate the variance associated with each of the sources of variation influencing a measurement. The nested design is particularly useful in preliminary experiments designed to estimate different sources of variation and in the planning of appropriate sampling strategies.
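For a balanced one-way random-effects design, the variance components fall out of the usual ANOVA mean squares: the within-group component is MS_within, and the between-group component is (MS_between − MS_within)/n, where n is the number of replicates per group. A sketch:

```python
def variance_components(groups):
    """Variance components for a balanced one-way random-effects (nested) design.

    groups: equally sized lists of measurements, one list per group.
    Returns (sigma2_between, sigma2_within).
    """
    k, n = len(groups), len(groups[0])
    means = [sum(g) / n for g in groups]
    grand = sum(means) / k
    ms_within = sum(sum((x - m) ** 2 for x in g)
                    for g, m in zip(groups, means)) / (k * (n - 1))
    ms_between = n * sum((m - grand) ** 2 for m in means) / (k - 1)
    # Negative estimates are conventionally truncated at zero.
    return max(0.0, (ms_between - ms_within) / n), ms_within
```

Comparing the two components shows which source of variation dominates, which is exactly the information needed to plan how sampling effort should be allocated.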

Relevance: 80.00%

Abstract:

Purpose: Optometrists are becoming more integrally involved in the diagnosis of and care for glaucoma patients in the UK. Correlating the apparent change in non-contact tonometry (NCT) IOP measurement with changes in other ocular parameters such as refractive error, corneal curvature, corneal thickness and treatment zone size (data available to optometrists after LASIK) would facilitate the care of these patients. Setting: A UK laser eye clinic. Methods: This is a retrospective study of 200 sequential eyes with myopia, with or without astigmatism, which underwent LASIK using a Hansatome and an Alcon LADARvision 4000 excimer laser. Refraction, keratometry, pachymetry and NCT IOP measurements were taken before treatment and again 3 months after treatment. The relationships between these variables and the treatment zones were studied using stepwise multiple regression analysis. Results: There was a mean difference of 5.54 mmHg comparing pre- and postoperative NCT IOP. IOP change correlates with refractive error change (P < 0.001), preoperative corneal thickness (P < 0.001) and treatment zone size (P = 0.047). Preoperative corneal thickness correlates with preoperative IOP (P < 0.001) and postoperative IOP (P < 0.001). Using these correlations, the measured difference in NCT IOP can be predicted preoperatively or postoperatively using derived equations. Conclusion: There is a significant reduction in measured NCT IOP after LASIK. The amount of reduction can be calculated using data acquired by optometrists. This is helpful for ophthalmologists and optometrists who co-manage glaucoma patients who have had LASIK, or glaucoma patients who are considering having LASIK.
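The abstract does not reproduce the derived equations, but the underlying step is ordinary least-squares regression of the measured IOP change on the pre/post parameter changes. A single-predictor sketch; the data below are synthetic and purely illustrative, not values from the study:

```python
def ols_line(x, y):
    """Ordinary least-squares fit of y = a + b*x; returns (a, b)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx
    return my - b * mx, b

# Synthetic example: IOP drop (mmHg) vs. magnitude of refractive correction (D)
a, b = ols_line([2.0, 4.0, 6.0, 8.0], [3.0, 4.5, 6.0, 7.5])
```

Stepwise multiple regression extends this by adding or dropping predictors (corneal thickness, treatment zone size, and so on) according to their significance.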

Relevance: 80.00%

Abstract:

This research thesis is concerned with the human factors aspects of industrial alarm systems within human supervisory control tasks. Typically such systems are located in central control rooms, and the information may be presented via visual display units. The thesis develops a human-centred, rather than engineering-centred, approach to the assessment, measurement and analysis of the situation. A human factors methodology was employed to investigate the human requirements through interviews, questionnaires, observation and controlled experiments. Based on the analysis of current industrial alarm systems in a variety of domains (power generation, manufacturing and coronary care), it is suggested that designers often do not pay due consideration to the human requirements, and that most alarm systems have severe shortcomings in human factors terms. The interviews, questionnaires and observations led to the proposal of 'alarm initiated activities' as a framework for the research to proceed. The framework comprises six main stages: observe, accept, analyse, investigate, correct and monitor. This framework served as a basis for laboratory research into alarm media. Under consideration were speech-based alarm displays and visual alarm displays; non-speech auditory displays were the subject of a literature review. The findings suggest that care needs to be taken when selecting the alarm media. Ideally the medium should be chosen to support the task requirements of the operator, rather than being arbitrarily assigned. It was also indicated that there may be some interference between the alarm initiated activities and the alarm media, i.e. information that supports one particular stage of alarm handling may interfere with another.
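The six-stage 'alarm initiated activities' framework can be read as a simple state sequence. A toy sketch; the stage names come from the thesis, but the looping transition logic is an assumption for illustration:

```python
from enum import Enum, auto

class AlarmStage(Enum):
    """The six alarm-initiated-activity stages named in the framework."""
    OBSERVE = auto()
    ACCEPT = auto()
    ANALYSE = auto()
    INVESTIGATE = auto()
    CORRECT = auto()
    MONITOR = auto()

# Nominal handling order of the framework
HANDLING_ORDER = [
    AlarmStage.OBSERVE, AlarmStage.ACCEPT, AlarmStage.ANALYSE,
    AlarmStage.INVESTIGATE, AlarmStage.CORRECT, AlarmStage.MONITOR,
]

def next_stage(stage):
    """Next stage in the nominal sequence (MONITOR loops back to OBSERVE)."""
    i = HANDLING_ORDER.index(stage)
    return HANDLING_ORDER[(i + 1) % len(HANDLING_ORDER)]
```

Modelling the stages explicitly makes it easy to ask, for each alarm medium, which stage of handling it supports and where it might interfere with another.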

Relevance: 80.00%

Abstract:

Cell surface properties of the basidiomycete yeast Cryptococcus neoformans were investigated with a combination of novel and well-proven approaches. Non-specific cell adhesion forces, as well as exposed carbohydrate and protein moieties potentially associated with specific cellular interaction, were analysed. Experimentation and analysis employed cryptococcal cells of different strains, capsular status and culture age. Investigation of cellular charge by particulate microelectrophoresis revealed that encapsulated yeast forms of C. neoformans manifest a distinctive negative charge regardless of the age of the cells involved; in turn, the neutral charge of acapsulate yeasts confirmed that the polysaccharide capsule, and not the cell wall, was responsible for this behaviour. Hydrophobicity was measured by MATH and HICH techniques, as well as by the attachment of polystyrene microspheres. All three techniques, where applicable, found C. neoformans yeast to be consistently hydrophilic; this state varied little with strain or culture age. Cell surface carbohydrates and protein were investigated with novel fluorescent tagging protocols, flow cytometry and confocal microscopy. Cell surface carbohydrate was identified by controlled oxidation in association with biotin hydrazide and fluorescein-streptavidin tagging. Marked amounts of carbohydrate were measured and observed on the cell wall surface of cryptococcal yeasts. Furthermore, tagging of carbohydrates with selective fluorescent lectins supported the identification, measurement and observation of substantial amounts of mannose, glucose and N-acetyl-glucosamine. Cryptococcal cell surface protein was identified using sulfo-NHS-biotin with fluorescein-streptavidin, and then readily quantified by flow cytometry. Confocal imaging of surface-exposed carbohydrate and protein revealed common localised areas of vivid fluorescence associated with buds, bud scars and nascent daughter cells. Carbohydrate and protein fluorescence often varied with the strain, culture age and capsule status of the cells examined. Finally, extension of the protein tagging techniques resulted in the isolation and extraction of two biotinylated proteins from the yeast cell wall surface of an acapsulate strain of C. neoformans.

Relevance: 80.00%

Abstract:

The thesis deals with a research programme in which the cutting performance of a new generation of ceramic cutting tool material is evaluated using the turning process. In part one, the performance of commercial Kyon 2000 sialon ceramic inserts is studied when machining a hardened alloy steel under a wide range of cutting conditions. The aim is to formulate a pattern of machining behaviour in which tool wear is related to a theoretical interpretation of the temperatures and stresses generated by the chip-tool interaction. The work involves a correlation of wear measurement and metallographic examination of the wear area with the measurable cutting data. Four main tool failure modes are recognised: (a) flank and crater wear, (b) grooving wear, (c) deformation wear and (d) brittle failure. Results indicate catastrophic edge breakdown under certain conditions. Accordingly, in part two the edge geometry is modified to give a double rake tool: a negative/positive combination. The results are reported for a range of workpiece materials under orthogonal cutting conditions. Significant improvements in cutting performance are achieved. The improvements are explained by a study of process parameters: cutting forces, chip thickness ratio, chip contact length, temperature distribution, stress distribution and chip formation. In part three, improvements in tool performance are shown to arise when the edge chamfer on a single rake tool is modified. Under optimum edge chamfer conditions a substantial increase in tool life is obtained compared with the commercial cutting geometry.

Relevance: 80.00%

Abstract:

A procedure has been developed which measures the settling velocity distribution of particles within a complete sewage sample. The development of the test method included observations of particle and liquid interaction using both synthetic media and sewage. Comparison studies with two other currently used settling velocity test procedures were undertaken. The method is suitable for use with either DWF or storm sewage. Information relating to the catchment characteristics of 35 wastewater treatment works was collected from the privatised water companies in England and Wales. 29 of these sites were used in an experimental programme to determine the settling velocity grading of 33 sewage samples. The collected data were analysed in an attempt to relate the settling velocity distribution to the characteristics of the contributing catchment. Statistical analysis of the catchment data and the measured settling velocity distributions was undertaken, and a curve-fitting exercise was performed using an S-shaped curve with the same physical characteristics as the settling velocity distributions. None of these analyses found evidence that the settling velocity distribution of sewage had a significant relationship with the chosen catchment characteristics. The regression equations produced from the statistical analysis cannot be used to assist in the design of separation devices. However, a grading curve envelope was produced, the limits of which were clearly defined for the measured data set. Since there was no evidence of a relationship between settling velocity grading and the characteristics of the contributing catchment, particularly the catchment area, the present empirical approach to settling tank design cannot currently be improved upon by considering variation in catchment parameters. This study has provided a basis for future research into settling velocity measurement and should be of benefit to future workers in this field.
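A log-logistic curve is one common choice of S-shaped form for a cumulative settling-velocity distribution: it is zero at the origin, rises monotonically and saturates at 1. The specific functional form used in the thesis is not given in the abstract, so the following is illustrative only:

```python
def settled_fraction(v, v50, steepness):
    """Cumulative fraction of particles with settling velocity below v (m/s).

    v50: median settling velocity; steepness controls how sharp the S is.
    Log-logistic form, chosen here as an illustrative S-curve.
    """
    if v <= 0:
        return 0.0
    return 1.0 / (1.0 + (v50 / v) ** steepness)
```

Fitting v50 and the steepness to each measured grading would give the parameters that the statistical analysis then attempted, unsuccessfully, to relate to catchment characteristics.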

Relevance: 80.00%

Abstract:

There are currently few biomaterials which combine controlled degradation rates with ease of melt processability. There are, however, many applications, ranging from surgical fixation devices to drug delivery systems, which require such a combination of properties. The work in this thesis is an attempt to increase the availability of such materials. Polyhydroxybutyrate-polyhydroxyvalerate copolymers are a new class of potentially biodegradable materials, although little quantitative data relating to their in vitro and in vivo degradation behaviour exist. The hydrolytic degradation of these copolymers has been examined in vitro under conditions ranging from 'physiological' to extremes of pH and elevated temperature. Progress of the degradation process was monitored by weight loss and water uptake measurement, x-ray diffractometry, optical and electron microscopy, together with changes in molecular weight by gel permeation chromatography. The extent to which the degradation mechanism could be modified by forming blends with polysaccharides and polycaprolactone was also investigated. The influence of the valerate content, molecular weight and crystallinity, together with the physical form of the sample and the pH and temperature of the aqueous medium, on the hydrolytic degradation was investigated. Its progress was characterised by an initial increase in the wet weight, with a concurrent decrease in the dry weight as the amorphous regions of the polymer are eroded, thereby producing an increase in matrix porosity. With the polysaccharide blends, this initial rate is dramatically affected, and erosion of the polysaccharide from the matrix markedly increases the internal porosity, which leads to the eventual collapse of the matrix, a process which also occurs, but less rapidly, in the degradation of the unblended polyhydroxybutyrate-polyhydroxyvalerate copolymers.
Surface energy measurement and goniophotometry proved potentially useful in monitoring the early stages of the degradation, where surface rather than bulk processes predominate and are characterised by little weight loss.