809 results for Multi-dimensional Numbered Information Spaces
Abstract:
Objective: Laryngeal and tongue function was assessed in 28 patients to evaluate the presence, nature, and resolution of superior recurrent laryngeal and hypoglossal nerve damage resulting from standard open primary carotid endarterectomy (CEA). Methods: Laryngeal and tongue function in 28 patients who underwent CEA was examined prospectively with various physiologic (Aerophone II, laryngograph, tongue transducer), acoustic (Multi-Dimensional Voice Program), and perceptual speech assessments. Measures were obtained from all participants preoperatively, and at 2 weeks and 3 months postoperatively. Results: The perceptual speech assessment indicated that the vocal quality of roughness was significantly more apparent at the 2-week postoperative assessment than preoperatively. However, by the 3-month postoperative assessment these values had returned to near preoperative levels, with no significant difference detected between preoperative and 3-month postoperative levels or between 2-week and 3-month postoperative levels. Both the instrumental assessments of laryngeal function and the acoustic assessment of vocal quality failed to identify any significant difference on any measure across the three assessment periods. Similarly, no significant impairment in tongue strength, endurance, or rate of repetitive tongue movements was detected on instrumental assessment of tongue function. Conclusions: No permanent changes to vocal or tongue function occurred in this group of participants after primary CEA. The lack of any significant long-term laryngeal or tongue dysfunction in this group suggests that the standard open CEA procedure is not associated with high rates of superior recurrent laryngeal and hypoglossal nerve dysfunction, as previously believed.
Abstract:
Nucleation is the first stage of any granulation process, in which binder liquid first comes into contact with the powder. This paper investigates the nucleation process where binder liquid is added to a fine powder with a spray nozzle. The dimensionless spray flux approach of Hapgood et al. (Powder Technol. 141 (2004) 20) is extended to account for nonuniform spray patterns and to allow for overlap of nuclei granules rather than spray drops. A dimensionless nuclei distribution function is developed that describes the effects of the design and operating parameters of the nucleation process (binder spray characteristics, the nucleation area ratio between droplets and nuclei, and the powder bed velocity) on the fractional surface area coverage of nuclei on a moving powder bed. From this starting point, a Monte Carlo nucleation model is developed that simulates full nuclei size distributions as a function of the design and operating parameters implemented in the dimensionless nuclei distribution function. The nucleation model was then used to investigate the effects of the design and operating parameters on the formed nuclei size distributions and to correlate these effects to changes of the dimensionless nuclei distribution function. Model simulations also showed that it is possible to predict nuclei size distributions beyond the drop-controlled nucleation regime in Hapgood's nucleation regime map. Qualitative comparison of model simulations and experimental nucleation data showed similar shapes of the nuclei size distributions. In its current form, the nucleation model can replace the nucleation term in one-dimensional population balance models describing wet granulation processes. Implementation of more sophisticated nucleation kinetics can make the model applicable to multi-dimensional population balance models.
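To make the mechanism concrete, the following is a minimal Monte Carlo sketch of the kind of nucleation simulation described above, under simplifying assumptions that are not the authors': droplets land uniformly over a fixed spray zone, bed motion is ignored, every droplet forms a circular nucleus whose footprint is the droplet footprint scaled by an assumed nucleation area ratio, and overlapping nuclei coalesce by conserving footprint area.

```python
import random

# Illustrative Monte Carlo nucleation sketch (not the published model): droplets land at
# random positions in a spray zone, each forms a circular nucleus, and overlapping nuclei
# are merged; the resulting nuclei diameters form the simulated size distribution.

def simulate_nuclei(n_drops=2000, zone_width=0.10, zone_length=0.05,
                    drop_diameter=100e-6, area_ratio=4.0, seed=0):
    rng = random.Random(seed)
    nucleus_radius = (drop_diameter / 2.0) * area_ratio ** 0.5  # nucleus area = K * drop area
    nuclei = []  # (x, y, radius) of the merged nuclei
    for _ in range(n_drops):
        x, y, r = rng.uniform(0.0, zone_width), rng.uniform(0.0, zone_length), nucleus_radius
        hit = next((i for i, (xi, yi, ri) in enumerate(nuclei)
                    if (x - xi) ** 2 + (y - yi) ** 2 < (r + ri) ** 2), None)
        if hit is None:
            nuclei.append((x, y, r))
        else:
            # crude coalescence rule: conserve footprint area, keep the existing centre
            xi, yi, ri = nuclei[hit]
            nuclei[hit] = (xi, yi, (ri ** 2 + r ** 2) ** 0.5)
    return sorted(2.0 * r for _, _, r in nuclei)  # nuclei diameters in metres

if __name__ == "__main__":
    sizes = simulate_nuclei()
    print(f"{len(sizes)} nuclei, median diameter {sizes[len(sizes) // 2]:.2e} m")
```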
Abstract:
Solving many scientific problems requires effective regression and/or classification models for large high-dimensional datasets. Experts from these problem domains (e.g. biologists, chemists, financial analysts) have insights into the domain which can be helpful in developing powerful models, but they need a modelling framework that helps them to use these insights. Data visualisation is an effective technique for presenting data and obtaining feedback from the experts. A single global regression model can rarely capture the full behavioural variability of a huge multi-dimensional dataset. Instead, local regression models, each focused on a separate area of the input space, often work better since the behaviour of different areas may vary. Classical local models such as Mixture of Experts segment the input space automatically, which is not always effective and also lacks the involvement of the domain experts needed to guide a meaningful segmentation of the input space. In this paper we address this issue by allowing domain experts to interactively segment the input space using data visualisation. The segmentation output obtained is then used to develop effective local regression models.
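As a toy illustration of the local-modelling idea (not the paper's visualisation-driven workflow), the sketch below fits one linear regression per expert-supplied segment label and predicts with the model that owns a point's segment; the hard-coded labels stand in for the interactive segmentation step.

```python
import numpy as np

# Sketch: one local linear model per expert-defined segment of the input space.

def fit_local_models(X, y, segment_labels):
    models = {}
    for seg in set(segment_labels):
        idx = [i for i, s in enumerate(segment_labels) if s == seg]
        Xs = np.c_[np.ones(len(idx)), X[idx]]            # add an intercept column
        coef, *_ = np.linalg.lstsq(Xs, y[idx], rcond=None)
        models[seg] = coef
    return models

def predict(models, x, seg):
    coef = models[seg]
    return coef[0] + coef[1:] @ np.asarray(x, dtype=float)

# toy usage: two segments with clearly different local behaviour
X = np.array([[0.1], [0.2], [0.3], [1.1], [1.2], [1.3]])
y = np.array([1.0, 2.0, 3.0, -1.0, -2.0, -3.0])
labels = ["A", "A", "A", "B", "B", "B"]
models = fit_local_models(X, y, labels)
print(predict(models, [0.25], "A"), predict(models, [1.25], "B"))  # ~2.5 and ~-2.5
```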
Abstract:
Today, the data available to tackle many scientific challenges is vast in quantity and diverse in nature. The exploration of heterogeneous information spaces requires suitable mining algorithms as well as effective visual interfaces. miniDVMS v1.8 provides a flexible visual data mining framework which combines advanced projection algorithms developed in the machine learning domain with visual techniques developed in the information visualisation domain. The advantage of this interface is that the user is directly involved in the data mining process. Principled projection methods, such as generative topographic mapping (GTM) and hierarchical GTM (HGTM), are integrated with powerful visual techniques, such as magnification factors, directional curvatures, parallel coordinates, and user interaction facilities, to provide this integrated visual data mining framework. The software also supports conventional visualisation techniques such as principal component analysis (PCA), Neuroscale, and PhiVis. This user manual gives an overview of the purpose of the software tool, highlights some of the issues to be taken care of while creating a new model, and provides information about how to install and use the tool. The user manual does not require readers to be familiar with the algorithms it implements; basic computing skills are enough to operate the software.
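For orientation, here is a minimal sketch of the simplest projection technique the manual lists (PCA to two dimensions, using synthetic data); GTM and HGTM are latent-variable models and are not reproduced here.

```python
import numpy as np

# Project high-dimensional data onto its first two principal components for visualisation.

def pca_2d(X):
    Xc = X - X.mean(axis=0)                      # centre the data
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:2].T                         # coordinates in the first two PCs

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))                   # toy high-dimensional dataset
coords = pca_2d(X)
print(coords.shape)                              # (200, 2): points ready for a scatter plot
```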
Abstract:
Innovation is part and parcel of any service that is to remain competitive in today's environment. Quality improvement in healthcare services is a complex, multi-dimensional task. This study proposes innovation management in healthcare services using a logical framework. A problem tree and an objective tree are developed to identify and mitigate issues and concerns. A logical framework is formulated to develop a plan for implementation and monitoring strategies, potentially creating an environment for continuous quality improvement in a specific unit. We recommend the logical framework as a valuable model for innovation management in healthcare services. Copyright © 2006 Inderscience Enterprises Ltd.
Abstract:
Purpose - The purpose of this paper is to develop an integrated quality management model that identifies problems, suggests solutions, develops a framework for implementation, and helps to evaluate healthcare service performance dynamically. Design/methodology/approach - This study used logical framework analysis (LFA) to improve the performance of healthcare service processes. LFA has three major steps: problem identification, solution derivation, and formation of a planning matrix for implementation. LFA has been applied in a case-study environment to three acute healthcare services (Operating Room utilisation, Accident and Emergency, and Intensive Care) in order to demonstrate its effectiveness. Findings - The paper finds that LFA is an effective method of quality management for hospital-based healthcare services. Research limitations/implications - This study shows LFA applied to three service processes in one hospital; this very limited population sample needs to be extended. Practical implications - The proposed model can be implemented in hospital-based healthcare services in order to improve performance. It may also be applied to other services. Originality/value - Quality improvement in healthcare services is a complex and multi-dimensional task. Although various quality management tools are routinely deployed for identifying quality issues in healthcare delivery, they are not without flaws, and there is an absence of an integrated approach that can identify and analyse issues, provide solutions to resolve those issues, and develop a project management framework to implement those solutions. This study introduces an integrated and uniform quality management tool for healthcare services. © Emerald Group Publishing Limited.
Abstract:
Purpose - The purpose of the paper is to develop an integrated quality management model which identifies problems, suggests solutions, develops a framework for implementation, and helps to evaluate the performance of healthcare services dynamically. Design/methodology/approach - This paper uses logical framework analysis (LFA), a matrix approach to project planning for managing quality. This has been applied to three acute healthcare services (Operating room utilization, Accident and emergency, and Intensive care) in order to demonstrate its effectiveness. Findings - The paper finds that LFA is an effective method of quality management for hospital-based healthcare services. Research limitations/implications - This paper shows LFA applied to three service processes in one hospital; ideally, it should also be tested in several hospitals and in other services. Practical implications - The proposed model can be applied in hospital-based healthcare services to improve performance. Originality/value - The paper shows that quality improvement in healthcare services is a complex and multi-dimensional task. Although various quality management tools are routinely deployed for identifying quality issues in health care delivery and corrective measures are taken for superior performance, there is an absence of an integrated approach that can identify and analyze issues, provide solutions to resolve those issues, and develop a project management framework (planning, monitoring, and evaluating) to implement those solutions and improve process performance. This study introduces an integrated and uniform quality management tool. It integrates operations with organizational strategies. © Emerald Group Publishing Limited.
Abstract:
Purpose – This paper aims to contribute to the debate on the drivers of the productivity gap that exists between the UK and its major international competitors. Design/methodology/approach – From the macro perspective the paper explores the quantitative evidence on the productivity differentials and how they are measured. From the micro perspective, the article explores the quantitative evidence on the role of management practices, claimed to be a key determinant in promoting firm competitiveness and in bridging the UK gap. Findings – This study suggests that management practices are an ambiguous driver of firm productivity and higher firm performance. On the methodological side, qualitative and subjective measures of either management practices or firm performance are often used, which makes the results not comparable across studies, across firms, or even within firms over time. Productivity and profitability are often, and erroneously, used interchangeably, although productivity is only one element of firm performance. On the other hand, management practices are multi-dimensional constructs that generally do not demonstrate a straightforward relationship with productivity variables. To assume that they are the only driver of higher productivity may be misleading. Moreover, there is evidence of an inverse causal relationship between management practices and firm performance. This calls into question most empirical results of the extant literature based on the unidirectional assumption of direct causality between management practices and firm performance. Research limitations/implications – These and other issues suggest that more research is needed to deepen the understanding of the UK productivity gap, and that more quantitative evidence should be provided on the way in which management practices contribute to UK competitiveness. Their impact is not easily measurable due to their complexity and their complementary nature, and this is a fertile ground for further research. Originality/value – This paper brings together the evidence on the UK productivity gap and its main drivers, provided by the economics, management and performance measurement literature. This issue ranks very highly on the agenda of policy makers and academics, and has important implications for practitioners interested in evaluating the impact of managerial best practices.
Abstract:
Self-service technology is affecting the service encounter. The potential reduction in personal contact through self-service technology may affect assessments of consumer satisfaction and commitment, making it necessary to investigate self-service technology usage, particularly the long-term impact on consumers' relationships with service organisations. Thus, this paper presents a framework for investigating the impact of self-service technology on consumer satisfaction and on a multi-dimensional measure of consumer commitment. Illustrative quotes from exploratory in-depth interviews support the framework and lead to a set of propositions. Future research directions for testing the framework are also discussed, and potential implications of this research are outlined.
Abstract:
FDI plays a key role in development, particularly in the resource-constrained transition economies of Central and Eastern Europe with relatively low savings rates. Gains from technology transfer play a critical role in motivating FDI, yet the potential for it may be hampered by a large technology gap between the source and host country. While the extent of this gap has traditionally been attributed to education, skills and capital intensity, recent literature has also emphasized the possible role of the institutional environment in this respect. Despite tremendous interest among policy-makers and academics in understanding the factors attracting FDI (Bevan and Estrin, 2000; Globerman and Shapiro, 2003), our knowledge about the effects of institutions on the location choice and ownership structure of foreign firms remains limited. This paper attempts to fill this gap in the literature by examining the link between institutions and foreign ownership structures. To the best of our knowledge, Javorcik (2004) is the only paper that uses firm-level data to analyse the role of institutional quality in an outward investor’s entry mode in transition countries. Our paper extends Javorcik (2004) in a number of ways: (a) rather than a cross-section, we use panel data for the period 1997-2006; (b) rather than a binary variable, we use the percentage of foreign ownership as a continuous variable; (c) we consider multi-dimensional institutional variables, such as corruption, intellectual property rights protection and government stability. We also use factor analysis to generate a composite index of institutional quality and examine how a stronger institutional environment could affect foreign ownership; (d) we explore how the distance between the institutional environments of the source and host countries affects foreign ownership in the host country. The firm-level data used include both domestic and foreign firms for the period 1997-2006 and are drawn from ORBIS, a commercially available dataset provided by Bureau van Dijk. In order to examine the link between institutions and foreign ownership structures, we estimate four log-linear ownership equations/specifications augmented by institutional and other control variables. We find evidence that the decision of a foreign firm to either locate its subsidiary or acquire an existing domestic firm depends not only on factor cost differences but also on differences in the institutional environment between the host and source countries.
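As a rough sketch of the composite-index step (the paper uses factor analysis; here the first principal component of the standardised indicators serves as a simpler stand-in, and the indicator data are synthetic):

```python
import numpy as np

# Build a single composite institutional-quality index from several standardised indicators
# (e.g. corruption, IPR protection, government stability) using the first principal component.

def composite_index(indicators):
    Z = (indicators - indicators.mean(axis=0)) / indicators.std(axis=0)
    _, _, Vt = np.linalg.svd(Z, full_matrices=False)
    loadings = Vt[0]
    if loadings.sum() < 0:            # orient the index so that higher = stronger institutions
        loadings = -loadings
    return Z @ loadings

rng = np.random.default_rng(1)
data = rng.normal(size=(50, 3))       # 50 country-year observations, 3 toy indicators
print(composite_index(data)[:5])      # composite index for the first five observations
```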
Abstract:
This paper begins by suggesting that when considering Corporate Social Responsibility (CSR), even CSR as justified in terms of the business case, stakeholders are of great importance to corporations. In the UK the Company Law Review (DTI, 2002) has suggested that it is appropriate for UK companies to be managed upon the basis of an enlightened shareholder approach. Within this approach the importance of stakeholders, other than shareholders, is recognised as being instrumental in providing shareholder value. Given the importance of these other stakeholders, it is then important that corporate management measure and manage stakeholder performance. In order to do this, there are two general approaches that could be adopted: the use of monetary values to reflect stakeholder value or cost, and the use of non-monetary values. In order to consider these approaches further, this paper considers their possible use for two stakeholder groups: namely, employees and the environment. It concludes that there are ethical and practical difficulties with calculating economic values for stakeholder resources, and so prefers a multi-dimensional approach to stakeholder performance measurement that does not use economic valuation.
Abstract:
Digital image processing is exploited in many diverse applications, but the size of digital images places excessive demands on current storage and transmission technology. Image data compression is required to permit further use of digital image processing. Conventional image compression techniques based on statistical analysis have reached a saturation level, so it is necessary to explore more radical methods. This thesis is concerned with novel methods, based on the use of fractals, for achieving significant compression of image data within reasonable processing time without introducing excessive distortion. Images are modelled as fractal data and this model is exploited directly by compression schemes. The validity of this is demonstrated by showing that the fractal complexity measure of fractal dimension is an excellent predictor of image compressibility. A method of fractal waveform coding is developed which has low computational demands and performs better than conventional waveform coding methods such as PCM and DPCM. Fractal techniques based on the use of space-filling curves are developed as a mechanism for hierarchical application of conventional techniques. Two particular applications are highlighted: the re-ordering of data during image scanning and the mapping of multi-dimensional data to one dimension. It is shown that there are many possible space-filling curves which may be used to scan images and that selection of an optimum curve leads to significantly improved data compression. The multi-dimensional mapping property of space-filling curves is used to speed up substantially the lookup process in vector quantisation. Iterated function systems are compared with vector quantisers, and the computational complexity of iterated function system encoding is also reduced by using the efficient matching algorithms identified for vector quantisers.
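To illustrate the multi-dimensional-to-one-dimensional mapping that space-filling curves provide, below is the standard Hilbert-curve index calculation (a textbook routine, not code from the thesis); sorting pixel coordinates by this index yields a Hilbert scan order instead of a raster scan.

```python
def hilbert_index(n, x, y):
    """Distance of cell (x, y) along a Hilbert curve filling an n x n grid (n a power of two)."""
    d = 0
    s = n // 2
    while s > 0:
        rx = 1 if (x & s) else 0
        ry = 1 if (y & s) else 0
        d += s * s * ((3 * rx) ^ ry)
        if ry == 0:                       # rotate/flip the quadrant into standard orientation
            if rx == 1:
                x, y = n - 1 - x, n - 1 - y
            x, y = y, x
        s //= 2
    return d

# scan an 8x8 image in Hilbert order rather than row-by-row
n = 8
scan_order = sorted(((x, y) for x in range(n) for y in range(n)),
                    key=lambda p: hilbert_index(n, *p))
print(scan_order[:8])                     # first few pixels visited by the space-filling scan
```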
Abstract:
Material processing using high-intensity femtosecond (fs) laser pulses is a fast-developing technology holding potential for direct writing of multi-dimensional optical structures in transparent media. In this work we re-examine nonlinear diffraction theory in the context of fs laser processing of silica in the sub-critical regime (input power less than the critical power of self-focusing). We have applied the well-known theory developed by Vlasov, Petrishev and Talanov, which gives an analytical description of the evolution of the root-mean-square width R_RMS(z) of a beam (not necessarily Gaussian) in a medium with Kerr nonlinearity.
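For orientation, the Vlasov-Petrishev-Talanov moment (virial) result has the schematic form below; the exact coefficients depend on the initial field and on the normalisation adopted in the paper, so this is indicative only:

$$
\frac{d^{2}}{dz^{2}}\,R_{\mathrm{RMS}}^{2}(z) = \mathrm{const} \propto H
\qquad\Longrightarrow\qquad
R_{\mathrm{RMS}}^{2}(z) = R_{\mathrm{RMS}}^{2}(0) + C_{1}\,z + C_{2}\,z^{2},
$$

where $H$ is the conserved Hamiltonian of the beam. For a collimated input beam in the sub-critical regime, $C_{2} > 0$ and the RMS width ultimately grows, while $C_{2}$ changes sign as the input power approaches the critical power of self-focusing.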
Abstract:
Investigations into the modelling techniques that depict the transport of discrete phases (gas bubbles or solid particles) and model biochemical reactions in a bubble column reactor are discussed here. The mixture model was used to calculate gas-liquid, solid-liquid and gas-liquid-solid interactions. Multiphase flow is a difficult phenomenon to capture, particularly in bubble columns where the major driving force is caused by the injection of gas bubbles. The gas bubbles cause a large density difference to occur that results in transient multi-dimensional fluid motion. Standard design procedures do not account for the transient motion, due to the simplifying assumptions of steady plug flow. Computational fluid dynamics (CFD) can assist in expanding the understanding of complex flows in bubble columns by characterising the flow phenomena for many geometrical configurations. Therefore, CFD has a role in the education of chemical and biochemical engineers, providing examples of flow phenomena that many engineers may not experience, even through experimentation. The performance of the mixture model was investigated for three domains (plane, rectangular and cylindrical) and three flow models (laminar, k-ε turbulence and the Reynolds stresses). This investigation raised many questions about how gas-liquid interactions are captured numerically. To answer some of these questions the analogy between thermal convection in a cavity and gas-liquid flow in bubble columns was invoked. This involved modelling the buoyant motion of air in a narrow cavity for a number of turbulence schemes. The difference in density was caused by a temperature gradient that acted across the width of the cavity. Multiple vortices were obtained when the Reynolds stresses were utilised with the addition of a basic flow profile after each time step. To implement the three-phase models an alternative mixture model was developed and compared against a commercially available mixture model for three turbulence schemes. The scheme where just the Reynolds stresses model was employed predicted the transient motion of the fluids quite well for both mixture models. Solid-liquid and then alternative formulations of the gas-liquid-solid model were compared against one another. The alternative form of the mixture model was found to perform particularly well for both gas and solid phase transport when calculating two- and three-phase flow. The improvement in the solutions obtained was a result of the inclusion of the Reynolds stresses model and differences in the mixture models employed. The differences between the alternative mixture models were found in the volume fraction equation (flux and deviatoric stress tensor terms) and the viscosity formulation for the mixture phase.
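For reference, the dispersed-phase continuity (volume fraction) equation of a standard mixture model, written here in a common textbook drift-flux form (the thesis's alternative mixture model differs precisely in the flux and deviatoric stress terms noted above):

$$
\frac{\partial (\alpha_{p}\rho_{p})}{\partial t} + \nabla \cdot \left(\alpha_{p}\rho_{p}\mathbf{u}_{m}\right) = -\,\nabla \cdot \left(\alpha_{p}\rho_{p}\mathbf{u}_{Mp}\right),
$$

where $\alpha_{p}$ and $\rho_{p}$ are the volume fraction and density of the dispersed phase, $\mathbf{u}_{m}$ is the mass-averaged mixture velocity, and $\mathbf{u}_{Mp} = \mathbf{u}_{p} - \mathbf{u}_{m}$ is the drift (diffusion) velocity of the dispersed phase relative to the mixture.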
Abstract:
This work examines prosody modelling for the Standard Yorùbá (SY) language in the context of computer text-to-speech synthesis applications. The thesis of this research is that it is possible to develop a practical prosody model by using appropriate computational tools and techniques that combine acoustic data with an encoding of the phonological and phonetic knowledge provided by experts. Our prosody model is conceptualised around a modular holistic framework. The framework is implemented using the Relational Tree (R-Tree) technique (Ehrich and Foith, 1976). An R-Tree is a sophisticated data structure that provides a multi-dimensional description of a waveform. A Skeletal Tree (S-Tree) is first generated using algorithms based on the tone phonological rules of SY. Subsequent steps update the S-Tree by computing the numerical values of the prosody dimensions. To implement the intonation dimension, fuzzy control rules were developed based on data from native speakers of Yorùbá. The Classification And Regression Tree (CART) and the Fuzzy Decision Tree (FDT) techniques were tested in modelling the duration dimension, and the FDT was selected on the basis of its better performance. An important feature of our R-Tree framework is its flexibility: it facilitates the independent implementation of the different dimensions of prosody, i.e. duration and intonation, using different techniques, and their subsequent integration. Our approach provides us with a flexible and extendible model that can also be used to implement, study and explain the theory behind aspects of the phenomena observed in speech prosody.
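To make the framework's central idea concrete, here is a minimal sketch (node names, fields and the toy models are illustrative assumptions, not the thesis's implementation): a skeletal tree is built first from phonological structure, then each prosody dimension is computed independently and written into the tree's nodes.

```python
from dataclasses import dataclass, field
from typing import Dict, List

# Sketch of a relational-tree-style structure: build a skeletal tree, then annotate its
# leaves with numerical prosody dimensions (duration, intonation) computed by separate models.

@dataclass
class ProsodyNode:
    label: str                                         # e.g. phrase, word, syllable
    children: List["ProsodyNode"] = field(default_factory=list)
    dimensions: Dict[str, float] = field(default_factory=dict)

def annotate(node: ProsodyNode, duration_model, f0_model) -> None:
    """Update the skeletal tree in place with numerical prosody values."""
    if not node.children:                              # leaves carry the acoustic targets
        node.dimensions["duration_ms"] = duration_model(node.label)
        node.dimensions["f0_hz"] = f0_model(node.label)
    for child in node.children:
        annotate(child, duration_model, f0_model)

# toy usage with placeholder duration and intonation models
skeleton = ProsodyNode("phrase", [ProsodyNode("syl:ba"), ProsodyNode("syl:ta")])
annotate(skeleton, duration_model=lambda s: 120.0, f0_model=lambda s: 180.0)
print(skeleton.children[0].dimensions)                 # {'duration_ms': 120.0, 'f0_hz': 180.0}
```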