42 results for multiple discriminant analysis
in QUB Research Portal - Research Directory and Institutional Repository for Queen's University Belfast
Abstract:
In this study, the surface properties of, and work required to remove, 12 commercially available and developmental catheters from a model biological medium (agar), a measure of catheter lubricity, were characterised and the relationships between these properties were examined using multiple regression and correlation analysis. The work required for removal of catheter sections (7 cm) from a model biological medium (1% w/w agar) was examined using tensile analysis. The water wettability of the catheters was characterised using dynamic contact angle analysis, whereas surface roughness was determined using atomic force microscopy. Significant differences in the ease of removal were observed between the various catheters, with the silicone-based materials generally exhibiting the greatest ease of removal. Similarly, the catheters exhibited a range of advancing and receding contact angles that were dependent on the chemical nature of each catheter. Finally, whilst the microrugosities of the various catheters differed, no specific relationship to the chemical nature of the biomaterial was apparent. Using multiple regression analysis, the relationship between ease of removal, receding contact angle and surface roughness was defined as: Work done (N mm) = 17.18 + 0.055 Rugosity (nm) - 0.52 Receding contact angle (degrees) (r = 0.49). Interestingly, whilst the relationship between ease of removal and surface roughness was significant (r = 0.48, p = 0.0005), in which catheter lubricity increased as the surface roughness decreased, this was not the case with the relationship between ease of removal and receding contact angle (r = -0.18, p > 0.05). This study has therefore uniquely defined the contributions of each of these surface properties to catheter lubricity. Accordingly, in the design of urethral catheters, it is recommended that due consideration should be directed towards biomaterial surface roughness to ensure maximal ease of catheter removal.
Furthermore, using the method described in this study, differences in the lubricity of the various catheters were observed that may be apparent in their clinical use. (C) 2003 Elsevier Ltd. All rights reserved.
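The two-predictor regression reported in the abstract above can be sketched numerically. The data below are synthetic stand-ins (the ranges for rugosity and contact angle are assumptions, not the study's measurements); the sketch only shows that ordinary least squares recovers coefficients of the reported form, Work done = 17.18 + 0.055 Rugosity - 0.52 Receding contact angle.

```python
import numpy as np

# Synthetic data generated from the reported relationship plus noise.
# The predictor ranges are made up for illustration only.
rng = np.random.default_rng(0)
n = 50
rugosity = rng.uniform(10, 200, n)         # surface roughness, nm (assumed range)
receding_angle = rng.uniform(20, 90, n)    # receding contact angle, degrees (assumed range)
work = 17.18 + 0.055 * rugosity - 0.52 * receding_angle + rng.normal(0, 2, n)

# Ordinary least squares: design matrix with an intercept column.
X = np.column_stack([np.ones(n), rugosity, receding_angle])
coef, *_ = np.linalg.lstsq(X, work, rcond=None)
print(coef)  # approximately [17.18, 0.055, -0.52]
```

The fitted coefficients approach the generating values as the noise shrinks or the sample grows, which is the sense in which the study's r = 0.49 quantifies how well the two surface properties jointly explain the removal work.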
Abstract:
Purpose: Environmental turbulence including rapid changes in technology and markets has resulted in the need for new approaches to performance measurement and benchmarking. There is a need for studies that attempt to measure and benchmark upstream, leading or developmental aspects of organizations. Therefore, the aim of this paper is twofold. The first is to conduct an in-depth case analysis of lead performance measurement and benchmarking leading to the further development of a conceptual model derived from the extant literature and initial survey data. The second is to outline future research agendas that could further develop the framework and the subject area.
Design/methodology/approach: A multiple case analysis involving repeated in-depth interviews with managers in organisational areas of upstream influence in the case organisations.
Findings: It was found that the effect of external drivers for lead performance measurement and benchmarking was mediated by organisational context factors such as level of progression in business improvement methods. Moreover, the legitimation of the business improvement methods used for this purpose, although typical, had been extended beyond their original purpose with the development of bespoke sets of lead measures.
Practical implications: Examples of methods and lead measures are given that can be used by organizations in developing a programme of lead performance measurement and benchmarking.
Originality/value: There is a paucity of in-depth studies relating to the theory and practice of lead performance measurement and benchmarking in organisations.
Absorbing new knowledge in small and medium-sized enterprises: A multiple case analysis of Six Sigma
Abstract:
The primary aim of this article is to critically analyse the development of Six Sigma theory and practice within small and medium-sized enterprises (SMEs) using a multiple case study approach. The article also explores the subsequent development of Lean Six Sigma as a means of addressing the perceived limitations of the efficacy of Six Sigma in this context. The overarching theoretical framework is that of absorptive capacity, where Six Sigma is conceptualized as new knowledge to be absorbed by smaller firms. The findings from a multiple case study involving repeat interviews and focus groups informed the development of an analytical model demonstrating the dynamic underlying routines for the absorptive capacity process and the development of a number of summative propositions relating the characteristics of SMEs to Six Sigma and Lean Six Sigma implementation.
Abstract:
A geostatistical version of the classical Fisher rule (linear discriminant analysis) is presented. This method is applicable when a large dataset of multivariate observations is available within a domain split into several known subdomains, and it assumes that the variograms (or covariance functions) are comparable between subdomains, which differ only in the mean values of the available variables. The method consists of finding the eigen-decomposition of the matrix W⁻¹B, where W is the matrix of sills of all direct- and cross-variograms, and B is the covariance matrix of the vectors of weighted means within each subdomain, obtained by generalized least squares. The method is used to map peat blanket occurrence in Northern Ireland, with data from the Tellus survey, which requires a minimal change to the general recipe: to use compositionally-compliant variogram tools and models, and work with log-ratio transformed data.
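The core discriminant step described above, eigen-decomposing W⁻¹B, can be sketched as a generalized eigenvalue problem. The matrices below are small made-up examples; in the actual method W comes from fitted variogram sills and B from generalized least squares means per subdomain.

```python
import numpy as np
from scipy.linalg import eig

# Hypothetical 2-variable example (real W and B come from variogram fitting
# and GLS means, not from these hand-picked numbers).
W = np.array([[2.0, 0.3],
              [0.3, 1.5]])   # sills of direct- and cross-variograms
B = np.array([[1.0, 0.4],
              [0.4, 0.6]])   # covariance of subdomain mean vectors

# Solving the generalized problem B v = lambda W v is equivalent to
# eigen-decomposing W^{-1} B, but avoids forming the explicit inverse.
eigvals, eigvecs = eig(B, W)
order = np.argsort(eigvals.real)[::-1]
discriminant_axes = eigvecs[:, order]   # columns = discriminant directions
print(eigvals.real[order])
```

Projecting the observations onto the leading columns of `discriminant_axes` gives the coordinates in which the subdomains are best separated, exactly as in the classical Fisher rule.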
Abstract:
Logistic regression and Gaussian mixture model (GMM) classifiers have been trained to estimate the probability of acute myocardial infarction (AMI) in patients based upon the concentrations of a panel of cardiac markers. The panel consists of two new markers, fatty acid binding protein (FABP) and glycogen phosphorylase BB (GPBB), in addition to the traditional cardiac troponin I (cTnI), creatine kinase MB (CKMB) and myoglobin. The effect of using principal component analysis (PCA) and Fisher discriminant analysis (FDA) to preprocess the marker concentrations was also investigated. The need for classifiers to give an accurate estimate of the probability of AMI is argued and three categories of performance measure are described, namely discriminatory ability, sharpness, and reliability. Numerical performance measures for each category are given and applied. The optimum classifier, based solely upon the samples taken on admission, was the logistic regression classifier using FDA preprocessing. This gave an accuracy of 0.85 (95% confidence interval: 0.78-0.91) and a normalised Brier score of 0.89. When samples at both admission and a further time, 1-6 h later, were included, the performance increased significantly, showing that logistic regression classifiers can indeed use the information from the five cardiac markers to accurately and reliably estimate the probability of AMI. © Springer-Verlag London Limited 2008.
Abstract:
Continuous large-scale changes in technology and the globalization of markets have resulted in the need for many SMEs to use innovation as a means of seeking competitive advantage, where innovation includes both technological and organizational perspectives (Tapscott, 2009). However, there is a paucity of systematic and empirical research relating to the implementation of innovation management in the context of SMEs. The aim of this article is to redress this imbalance via an empirical study created to develop and test a model of innovation implementation in SMEs. This study uses Structural Equation Modelling (SEM) to test the plausibility of an innovation model, developed from earlier studies, as the basis of a questionnaire survey of 395 SMEs in the UK. The resultant model and construct relationship results are further probed using an explanatory multiple case analysis to explore 'how' and 'why' type questions within the model and construct relationships. The findings show that the effects of leadership, people and culture on innovation implementation are mediated by business improvement activities relating to Total Quality Management/Continuous Improvement (TQM/CI) and product and process developments. It is concluded that SMEs have an opportunity to leverage existing quality and process improvement activities to move beyond continuous improvement outcomes towards effective innovation implementation. The article concludes by suggesting areas suitable for further research.
Abstract:
This paper introduces a new technique for palmprint recognition based on Fisher Linear Discriminant Analysis (FLDA) and a Gabor filter bank. This method involves convolving a palmprint image with a bank of Gabor filters at different scales and rotations for robust palmprint feature extraction. Once these features are extracted, FLDA is applied for dimensionality reduction and class separability. Since the palmprint features are derived from the principal lines, wrinkles and texture along the palm area, one should carefully consider this fact when selecting the appropriate palm region for the feature extraction process in order to enhance recognition accuracy. To address this problem, an improved region of interest (ROI) extraction algorithm is introduced. This algorithm allows for an efficient extraction of the whole palm area by ignoring all the undesirable parts, such as the fingers and background. Experiments have shown that the proposed method yields attractive performance, as evidenced by an Equal Error Rate (EER) of 0.03%.
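The Gabor-plus-FLDA pipeline described above can be sketched end to end. Everything here is a stand-in: the images are random arrays rather than real palmprint ROIs, the kernel sizes, frequencies and orientations are arbitrary choices, and the four "subjects" are hypothetical; the sketch only shows the shape of the pipeline (filter bank responses pooled into features, then FLDA for reduction).

```python
import numpy as np
from scipy.signal import fftconvolve
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def gabor_kernel(freq, theta, size=15, sigma=3.0):
    """Real part of a Gabor kernel at a given spatial frequency and orientation."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    return np.exp(-(x**2 + y**2) / (2 * sigma**2)) * np.cos(2 * np.pi * freq * xr)

def gabor_features(img, freqs=(0.1, 0.2),
                   thetas=(0, np.pi / 4, np.pi / 2, 3 * np.pi / 4)):
    """Mean absolute filter response per (scale, orientation) pair: 8 values."""
    return np.array([np.abs(fftconvolve(img, gabor_kernel(f, t), mode="same")).mean()
                     for f in freqs for t in thetas])

rng = np.random.default_rng(2)
imgs = rng.normal(size=(40, 32, 32))        # fake 32x32 palmprint ROIs
labels = np.repeat(np.arange(4), 10)        # 4 hypothetical subjects, 10 images each
feats = np.array([gabor_features(im) for im in imgs])

# FLDA reduces the 8-D Gabor features to at most n_classes - 1 = 3 dimensions.
lda = LinearDiscriminantAnalysis(n_components=3)
reduced = lda.fit_transform(feats, labels)
print(reduced.shape)
```

In a real system, matching would then compare the reduced vectors (e.g. by nearest neighbour) and the EER would be measured on genuine versus impostor comparisons.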