70 results for Framework Model

in Aston University Research Archive


Relevance:

40.00%

Publisher:

Abstract:

Background: Coronary heart disease (CHD) is a public health priority in the UK. The National Service Framework (NSF) has set standards for the prevention, diagnosis and treatment of CHD, which include the use of cholesterol-lowering agents aimed at achieving targets of blood total cholesterol (TC) < 5.0 mmol/L and low density lipoprotein-cholesterol (LDL-C) < 3.0 mmol/L. In order to achieve these targets cost effectively, prescribers need to make an informed choice from the range of statins available. Aim: To estimate the average and relative cost effectiveness of atorvastatin, fluvastatin, pravastatin and simvastatin in achieving the NSF LDL-C and TC targets. Design: Model-based economic evaluation. Methods: An economic model was constructed to estimate the number of patients achieving the NSF targets for LDL-C and TC at each dose of statin, and to calculate the average drug cost and incremental drug cost per patient achieving the target levels. The population baseline LDL-C and TC, drug efficacy and drug costs were taken from previously published data. Estimates of the distribution of patients receiving each dose of statin were derived from the UK national DIN-LINK database. Results: The estimated annual drug cost per 1000 patients treated was £289 000 with atorvastatin, £315 000 with simvastatin, £333 000 with pravastatin and £167 000 with fluvastatin. The percentages of patients achieving target were 74.4%, 46.4%, 28.4% and 13.2% for atorvastatin, simvastatin, pravastatin and fluvastatin, respectively. Incremental drug costs per extra patient treated to LDL-C and TC targets compared with fluvastatin were £198 and £226 for atorvastatin, £443 and £567 for simvastatin, and £1089 and £2298 for pravastatin, using 2002 drug costs. Conclusions: As a result of its superior efficacy, atorvastatin generates a favourable cost-effectiveness profile as measured by drug cost per patient treated to LDL-C and TC targets.
For a given drug budget, more patients would achieve NSF LDL-C and TC targets with atorvastatin than with any of the other statins examined.
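The incremental figures quoted in the abstract can be reproduced (to within rounding of the published percentages) from the annual drug costs and LDL-C attainment rates it reports. A minimal sketch, using only the numbers given above:

```python
# Illustrative reconstruction of the incremental cost-effectiveness
# arithmetic: published annual drug cost per 1000 patients treated,
# and published LDL-C target-attainment rates.
annual_cost_per_1000 = {"atorvastatin": 289_000, "simvastatin": 315_000,
                        "pravastatin": 333_000, "fluvastatin": 167_000}
ldl_target_rate = {"atorvastatin": 0.744, "simvastatin": 0.464,
                   "pravastatin": 0.284, "fluvastatin": 0.132}

def incremental_cost_per_extra_patient(drug, baseline="fluvastatin"):
    """Extra annual drug cost divided by the extra number of patients
    (per 1000 treated) reaching the LDL-C target, relative to the
    cheapest statin, fluvastatin."""
    extra_cost = annual_cost_per_1000[drug] - annual_cost_per_1000[baseline]
    extra_patients = 1000 * (ldl_target_rate[drug] - ldl_target_rate[baseline])
    return extra_cost / extra_patients

icer = {d: incremental_cost_per_extra_patient(d)
        for d in ("atorvastatin", "simvastatin", "pravastatin")}
# Yields roughly £199, £446 and £1092 -- matching the published
# £198, £443 and £1089 once rounding of the input percentages is allowed for.
```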

Relevance:

30.00%

Publisher:

Abstract:

The ERS-1 satellite carries a scatterometer which measures the amount of radiation scattered back toward the satellite by the ocean's surface. These measurements can be used to infer wind vectors. The implementation of a neural network based forward model which maps wind vectors to radar backscatter is addressed. Input noise cannot be neglected, so a Bayesian framework is adopted. However, Markov Chain Monte Carlo sampling is too computationally expensive; instead, gradient information is used with a non-linear optimisation algorithm to find the maximum a posteriori probability values of the unknown variables. The resulting models are shown to compare well with the current operational model when visualised in the target space.
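The MAP-by-optimisation idea can be sketched in miniature. This is a toy illustration only: the real forward model is a neural network, and the simple analytic map, noise levels, prior widths and step size below are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def forward(w):
    # Toy stand-in for the neural-network forward model:
    # maps a 2-vector of unknowns to a 2-vector of observables.
    return np.array([w[0] * np.cos(w[1]), w[0] * np.sin(w[1])])

w_true = np.array([2.0, 0.5])
obs = forward(w_true) + 0.01 * rng.standard_normal(2)  # noisy observation

def neg_log_posterior(w, sigma_obs=0.1, sigma_prior=10.0):
    # Gaussian likelihood plus Gaussian prior give quadratic penalties;
    # minimising this sum is equivalent to maximising the posterior.
    misfit = np.sum((forward(w) - obs) ** 2) / (2 * sigma_obs ** 2)
    prior = np.sum(w ** 2) / (2 * sigma_prior ** 2)
    return misfit + prior

def grad(w, eps=1e-6):
    # Central finite-difference gradient (a real implementation would
    # use analytic gradients propagated through the network).
    g = np.zeros_like(w)
    for i in range(len(w)):
        d = np.zeros_like(w)
        d[i] = eps
        g[i] = (neg_log_posterior(w + d) - neg_log_posterior(w - d)) / (2 * eps)
    return g

w = np.array([1.0, 0.0])          # initial guess
for _ in range(2000):             # simple gradient descent to the MAP point
    w -= 1e-3 * grad(w)
```

A production system would use a quasi-Newton optimiser rather than fixed-step gradient descent, but the structure (negative log posterior = data misfit + prior penalty, minimised using gradient information) is the same.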

Relevance:

30.00%

Publisher:

Abstract:

This Letter addresses image segmentation via a generative model approach. A Bayesian network (BNT) in the space of dyadic wavelet transform coefficients is introduced to model texture images. The model is similar to a Hidden Markov model (HMM), but with non-stationary transition probability distributions. It is composed of discrete hidden variables and observable Gaussian outputs for the wavelet coefficients. In particular, the Gabor wavelet transform is considered. The introduced model is compared with the simplest joint Gaussian probabilistic model for Gabor wavelet coefficients for several textures from the Brodatz album [1]. The comparison is based on cross-validation and includes probabilistic model ensembles instead of single models. In addition, the robustness of the models to additive Gaussian noise is investigated. We further study the feasibility of the introduced generative model for image segmentation in the novelty detection framework [2]. Two examples are considered: (i) sea surface pollution detection from intensity images and (ii) segmentation of still images with varying illumination across the scene.
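The core of the comparison strategy — scoring rival probabilistic models of wavelet coefficients by held-out likelihood — can be sketched with deliberately simple models. The single-Gaussian models and synthetic data below are illustrative stand-ins, not the paper's BNT/HMM or its Gabor features.

```python
import numpy as np

# Synthetic stand-in for a set of wavelet coefficients from one texture.
rng = np.random.default_rng(1)
data = rng.normal(0.0, 2.0, size=500)
train, test = data[:400], data[400:]

def gaussian_loglik(x, mu, sigma):
    # Total log-likelihood of the samples under N(mu, sigma^2).
    return np.sum(-0.5 * np.log(2 * np.pi * sigma ** 2)
                  - (x - mu) ** 2 / (2 * sigma ** 2))

# Model A: parameters fitted on the training split.
mu, sigma = train.mean(), train.std()
score_a = gaussian_loglik(test, mu, sigma)

# Model B: a mis-specified fixed alternative, scored on the same split.
score_b = gaussian_loglik(test, 0.0, 1.0)
# The better-specified model wins on held-out log-likelihood;
# repeating over folds gives the cross-validated comparison.
```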

Relevance:

30.00%

Publisher:

Abstract:

How do signals from the two eyes combine and interact? Our recent work has challenged earlier schemes in which monocular contrast signals are subject to square-law transduction followed by summation across eyes and binocular gain control. Much more successful was a new 'two-stage' model in which the initial transducer was almost linear and contrast gain control occurred both pre- and post-binocular summation. Here we extend that work by: (i) exploring the two-dimensional stimulus space (defined by left- and right-eye contrasts) more thoroughly, and (ii) performing contrast discrimination and contrast matching tasks for the same stimuli. Twenty-five base-stimuli, made from 1 c/deg patches of horizontal grating, were defined by the factorial combination of five contrasts for the left eye (0.3-32%) with five contrasts for the right eye (0.3-32%). Other than in contrast, the gratings in the two eyes were identical. In a 2IFC discrimination task, the base-stimuli were masks (pedestals), where the contrast increment was presented to one eye only. In a matching task, the base-stimuli were standards to which observers matched the contrast of either a monocular or binocular test grating. In the model, discrimination depends on the local gradient of the observer's internal contrast-response function, while matching equates the magnitude (rather than gradient) of the response to the test and standard. With all model parameters fixed by previous work, the two-stage model successfully predicted both the discrimination and the matching data, and was much more successful than linear or quadratic binocular summation models. These results show that performance measures and perception (contrast discrimination and contrast matching) can be understood in the same theoretical framework for binocular contrast vision. © 2007 VSP.
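The general shape of a two-stage gain-control architecture — interocular suppression before binocular summation, then a second gain-control stage after it — can be sketched schematically. The exponents and constants below are placeholders chosen for illustration, not the fitted values of the published model.

```python
def stage1(c_own, c_other, s=1.0, m=1.3):
    # Stage 1: near-linear monocular transduction with the other
    # eye's contrast entering the suppressive denominator.
    return c_own ** m / (s + c_own + c_other)

def binocular_response(c_left, c_right, z=0.1, p=2.0, q=1.5):
    # Binocular summation of the stage-1 outputs, followed by a
    # second (post-summation) contrast gain-control stage.
    b = stage1(c_left, c_right) + stage1(c_right, c_left)
    return b ** p / (z + b ** q)

# Presenting the same contrast to both eyes raises the response above
# the monocular case, but interocular suppression keeps the gain well
# below a factor of two.
mono = binocular_response(10.0, 0.0)
bino = binocular_response(10.0, 10.0)
```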

Relevance:

30.00%

Publisher:

Abstract:

Innovation is part and parcel of any service that is to remain competitive in today's environment. Quality improvement in healthcare services is a complex, multi-dimensional task. This study proposes innovation management in healthcare services using a logical framework. A problem tree and an objective tree are developed to identify and mitigate issues and concerns. A logical framework is formulated to develop a plan for implementation and monitoring strategies, potentially creating an environment for continuous quality improvement in a specific unit. We recommend the logical framework as a valuable model for innovation management in healthcare services. Copyright © 2006 Inderscience Enterprises Ltd.

Relevance:

30.00%

Publisher:

Abstract:

Purpose - The purpose of this paper is to develop an integrated quality management model that identifies problems, suggests solutions, develops a framework for implementation and helps to evaluate healthcare service performance dynamically. Design/methodology/approach - This study used logical framework analysis (LFA) to improve the performance of healthcare service processes. LFA has three major steps - problem identification, solution derivation, and formation of a planning matrix for implementation. LFA has been applied in a case-study environment to three acute healthcare services (Operating Room utilisation, Accident and Emergency, and Intensive Care) in order to demonstrate its effectiveness. Findings - The paper finds that LFA is an effective method of quality management for hospital-based healthcare services. Research limitations/implications - This study shows LFA application in three service processes in one hospital. This very limited population sample needs to be extended. Practical implications - The proposed model can be implemented in hospital-based healthcare services in order to improve performance. It may also be applied to other services. Originality/value - Quality improvement in healthcare services is a complex and multi-dimensional task. Although various quality management tools are routinely deployed for identifying quality issues in healthcare delivery, they are not without flaws, and there is an absence of an integrated approach that can identify and analyse issues, provide solutions to resolve those issues, and develop a project management framework to implement those solutions. This study introduces an integrated and uniform quality management tool for healthcare services. © Emerald Group Publishing Limited.

Relevance:

30.00%

Publisher:

Abstract:

This paper is concerned with the ways in which transactional and transformational leadership styles can improve the service performance of front-line staff. Past literature on services marketing has indicated the importance of leadership but has largely ignored the parallel literature in which leadership styles have been conceptualized and operationalized (e.g., sales management, organizational psychology). This paper seeks to build upon existing services marketing theory by introducing the role of leadership styles in enhancing service performance. Consequently, a conceptual framework of the effect of transactional and transformational leadership styles on service performance, anchored in a cross-disciplinary literature review, is developed. Managerial implications and future research directions are also discussed.

Relevance:

30.00%

Publisher:

Abstract:

This paper is concerned with the effects that leadership styles (i.e., transactional and transformational) can have upon the level of front-line employees’ service delivery quality. Previous literature has mostly looked at leadership and its effects upon subordinates within a sales, psychology, or human resources context. However, due to the idiosyncrasies inherent in services (i.e., intangibility, heterogeneity, perishability, and inseparability), it is likely that, in such a context, leadership styles will affect performance outcomes differently. Consequently, this paper seeks to expand the services marketing literature by developing a conceptual framework of leadership style effects adapted to the field of services marketing. Of particular importance are the effects that leadership styles have upon front-line employee “motivators” and service-related job outcomes. Specific hypotheses are developed and future research directions are also presented for consideration.

Relevance:

30.00%

Publisher:

Abstract:

This article proposes a framework of alternative international marketing strategies, based on the evaluation of intra- and inter-cultural behavioural homogeneity for market segmentation. The framework developed in this study provides a generic structure to behavioural homogeneity, proposing consumer involvement as a construct with unique predictive ability for international marketing strategy decisions. A model-based segmentation process, using structural equation models, is implemented to illustrate the application of the framework.

Relevance:

30.00%

Publisher:

Abstract:

In the UK, low vision rehabilitation is delivered by a wide variety of providers, with different strategies being used to integrate services from health, social care and the voluntary sector. In order to capture the current diversity of service provision, the Low Vision Service Model Evaluation (LOVSME) project aimed to profile selected low vision services using published standards for service delivery as a guide. Seven geographically and organizationally varied low-vision services across England were chosen for their diversity, and all agreed to participate. A series of questionnaires and follow-up visits were undertaken to obtain a comprehensive description of each service, including the staff workloads and the cost of providing the service. In this paper the strengths of each model of delivery are discussed, and examples of good practice identified. As a result of the project, an Assessment Framework tool has been developed that aims to help other service providers evaluate different aspects of their own service to identify any gaps in existing service provision, and will act as a benchmark for future service development.

Relevance:

30.00%

Publisher:

Abstract:

When constructing and using environmental models, it is typical that many of the inputs to the models will not be known perfectly. In some cases, it will be possible to make observations, or occasionally to use physics-based uncertainty propagation, to ascertain the uncertainty on these inputs. However, such observations are often either unavailable or impossible, and another approach to characterising the uncertainty on the inputs must be sought. Even when observations are available, if the analysis is being carried out within a Bayesian framework then prior distributions will have to be specified. One option for gathering, or at least estimating, this information is to employ expert elicitation. Expert elicitation is well studied within statistics and psychology and involves the assessment of the beliefs of a group of experts about an uncertain quantity (for example, an input or parameter within a model), typically in terms of obtaining a probability distribution. One of the challenges in expert elicitation is to minimise the biases that might enter into the judgements made by the individual experts, and then to come to a consensus decision within the group of experts. Effort is made in the elicitation exercise to prevent biases clouding the judgements through well-devised questioning schemes. It is also important that, when reaching a consensus, the experts are exposed to the knowledge of the others in the group. Within the FP7 UncertWeb project (http://www.uncertweb.org/), there is a requirement to build a Web-based tool for expert elicitation. In this paper, we discuss some of the issues of building a Web-based elicitation system - both the technological aspects and the statistical and scientific issues. In particular, we demonstrate two tools: a Web-based system for the elicitation of continuous random variables and a system designed to elicit uncertainty about categorical random variables in the setting of landcover classification uncertainty.
The first of these examples is a generic tool developed to elicit uncertainty about univariate continuous random variables. It is designed to be used within an application context and extends the existing SHELF method, adding a web interface and access to metadata. The tool is developed so that it can be readily integrated with environmental models exposed as web services. The second example was developed for the TREES-3 initiative, which monitors tropical landcover change through ground-truthing at confluence points. It allows experts to validate the accuracy of automated landcover classifications using site-specific imagery and local knowledge. Experts may provide uncertainty information at various levels: from a general rating of their confidence in a site validation to a numerical ranking of the possible landcover types within a segment. A key challenge in the web-based setting is the design of the user interface and the method of interacting between the problem owner and the problem experts. We show the workflow of the elicitation tool, and show how we can represent the final elicited distributions and confusion matrices using UncertML, ready for integration into uncertainty-enabled workflows. We also show how the metadata associated with the elicitation exercise is captured and can be referenced from the elicited result, providing crucial lineage information and thus traceability in the decision making process.
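One elementary step in this kind of elicitation workflow is converting an expert's stated quantiles into a probability distribution. A minimal sketch, fitting a normal distribution to an elicited median and 90% credible interval (the elicited numbers are made up for illustration, and this is not the SHELF tool's actual fitting procedure, which supports several distribution families):

```python
from statistics import NormalDist

# Hypothetical elicited judgements: median (P50) and 90% interval (P5, P95).
median, lo, hi = 20.0, 12.0, 28.0

z95 = NormalDist().inv_cdf(0.95)       # standard normal 95th percentile
mu = median                            # normal is symmetric about its mean
sigma = (hi - lo) / (2 * z95)          # interval width fixes the spread
fitted = NormalDist(mu, sigma)
# By construction, fitted.cdf(lo) = 0.05 and fitted.cdf(hi) = 0.95.
```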

Relevance:

30.00%

Publisher:

Abstract:

The potential of social marketing has been recognized in the United Kingdom by the Department for Environment, Food and Rural Affairs (DEFRA) as a useful tool for behavioral change for environmental problems. The techniques of social marketing have been used successfully by health organizations to tackle current public health issues. This article describes a research project which explored the current barriers to recycling household waste and the development of a segmentation model which could be used at the local level by authorities charged with waste collection and disposal. The research makes a unique contribution to social marketing through the introduction of a competencies framework and market segmentation for recycling behaviors.

Relevance:

30.00%

Publisher:

Abstract:

This work examines prosody modelling for the Standard Yorùbá (SY) language in the context of computer text-to-speech synthesis applications. The thesis of this research is that it is possible to develop a practical prosody model by using appropriate computational tools and techniques which combine acoustic data with an encoding of the phonological and phonetic knowledge provided by experts. Our prosody model is conceptualised around a modular holistic framework. The framework is implemented using the Relational Tree (R-Tree) technique (Ehrich and Foith, 1976). The R-Tree is a sophisticated data structure that provides a multi-dimensional description of a waveform. A Skeletal Tree (S-Tree) is first generated using algorithms based on the tone phonological rules of SY. Subsequent steps update the S-Tree by computing the numerical values of the prosody dimensions. To implement the intonation dimension, fuzzy control rules were developed based on data from native speakers of Yorùbá. The Classification And Regression Tree (CART) and the Fuzzy Decision Tree (FDT) techniques were tested in modelling the duration dimension, and the FDT was selected on the basis of its better performance. An important feature of our R-Tree framework is its flexibility: it facilitates the independent implementation of the different dimensions of prosody, i.e. duration and intonation, using different techniques, and their subsequent integration. Our approach provides us with a flexible and extendible model that can also be used to implement, study and explain the theory behind aspects of the phenomena observed in speech prosody.

Relevance:

30.00%

Publisher:

Abstract:

The research investigates the processes of adoption and implementation, by organisations, of computer aided production management systems (CAPM). It is organised around two different theoretical perspectives. The first part is informed by the Rogers model of the diffusion, adoption and implementation of innovations, and the second part by a social constructionist approach to technology. Rogers' work is critically evaluated, and a model of adoption and implementation is distilled from it and applied to a set of empirical case studies. In the light of the case study data, strengths and weaknesses of the model are identified. It is argued that the model is too rational and linear to provide an adequate explanation of adoption processes. It is useful for understanding processes of implementation but requires further development. The model is not able to adequately encompass complex computer based technologies. However, the idea of 'reinvention' is identified as Rogers' key concept, but it needs to be conceptually extended. Both Rogers' model and the definitions of CAPM found in the literature from production engineering tend to treat CAPM in objectivist terms. The problems with this view are addressed through a review of the literature on the sociology of technology, and it is argued that a social constructionist approach offers a more useful framework for understanding CAPM, its nature, adoption, implementation, and use. CAPM, it is argued, must be understood in terms of the ways in which it is constituted in discourse, as part of a 'struggle for meaning' on the part of academics, professional engineers, suppliers, and users.