12 results for Indicators and Reagents
in Aston University Research Archive
A comparison of antibiotic prescribing indicators and medicines management scoring in secondary care
Abstract:
Poster:
- Robust prescribing indicators analogous to those used in primary care are not currently available in NHS hospital trusts.
- The Department of Health has recently implemented a scheme for self-assessment scoring of medicines management processes (maximum score 23) in NHS hospitals.
- There is no clear relationship between the average values of two antibiotic prescribing indicators obtained in ten NHS hospital trusts in the West Midlands.
- There is no clear relationship between either indicator value and the corresponding self-assessment medicines management score.
- This study highlights the difficulties involved in assessing medicines management processes in NHS hospitals; better medicines management evaluation systems are needed.
Abstract:
This paper first analyses the Performance Related Pay (PRP) schemes developed from 1992/93 to 2002/03 in a large Business School in England, and then the School's mission and strategic objectives in that period. The PRP schemes changed to include more specific performance indicators, and these were increasingly linked to the objectives. The School's resources allocated to PRP increased from £44,000 in 1992/93 to £355,000 in 2002/03, and from 1.08% of the School's income in 1995/96 to 2.37% in 2002/03. As well as examining the changing strategic objectives and PRP schemes, the paper charts the development of the School's reputation and resources and the role which staff motivation via PRP played at different stages. The paper concludes that the PRP scheme was at its most effective when it was clearly linked with the School's strategic objectives, but that the relationship between objectives and motivation may be more complex than is apparent from this study. Although the PRP scheme under consideration also applies to academic-related staff, this paper concentrates on the effect on academic staff.
Abstract:
We estimate the shape of the distribution of stock prices using data from options on the underlying asset, and test whether this distribution is distorted in a systematic manner each time a particular news event occurs. In particular, we look at the response of the FTSE 100 index to market-wide announcements of key macroeconomic indicators and policy variables. We show that the whole distribution of stock prices can be distorted on an event day. The shift in distributional shape happens whether the event is characterised as an announcement occurrence or as a measured surprise. We find that larger surprises have proportionately greater impact, and that higher moments are more sensitive to events, however characterised.
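The estimation idea can be illustrated with the standard Breeden-Litzenberger result, under which the risk-neutral density of the underlying is the discounted second derivative of the call price with respect to strike. The sketch below fabricates a smooth option chain from Black-Scholes prices as a stand-in for observed FTSE 100 quotes; all parameter values are hypothetical, and this is not necessarily the authors' estimation procedure.

```python
import numpy as np
from scipy.stats import norm

def bs_call(S, K, r, T, sigma):
    """Black-Scholes call price, used here only to fabricate a smooth option chain."""
    d1 = (np.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    return S * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

# Hypothetical market parameters standing in for FTSE 100 option quotes.
S0, r, T, sigma = 6500.0, 0.04, 30 / 365, 0.18
strikes = np.linspace(5000, 8000, 301)
calls = bs_call(S0, strikes, r, T, sigma)

# Breeden-Litzenberger: f(K) = exp(rT) * d2C/dK2, estimated by finite differences.
dK = strikes[1] - strikes[0]
density = np.exp(r * T) * np.gradient(np.gradient(calls, dK), dK)

# Comparing densities estimated on event and non-event days exposes shifts
# in the whole distribution, including its higher moments.
print(f"density integrates to ~{density.sum() * dK:.3f}")
```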
Abstract:
The subject of investigation of the present research is the use of smart hydrogels with fibre optic sensor technology. The aim was to develop a cost-effective sensor platform for the detection of water in hydrocarbon media, and of dissolved inorganic analytes, namely potassium, calcium and aluminium. The fibre optic sensors in this work depend upon the use of hydrogels either to entrap chemotropic agents or to respond to external environmental changes by changing their inherent properties, such as refractive index (RI). A review of current fibre optic sensing technology showed that the main principles utilised are either the measurement of signal loss or a change in the wavelength of the light transmitted through the system. The signal-loss principle relies on changing the conditions required for total internal reflection to occur. Hydrogels are cross-linked polymer networks that swell but do not dissolve in aqueous environments. Smart hydrogels are synthetic materials that exhibit properties additional to those inherent in their structure. In order to control the non-inherent properties, the hydrogels were fabricated with the addition of chemotropic agents. For the detection of water, hydrogels of low refractive index were synthesised using fluorinated monomers. Sulfonated monomers were used for their extreme hydrophilicity as a means of water sensing through an RI change. To enhance the sensing capability of the hydrogel, chemotropic agents, such as pH indicators and cobalt salts, were used. The system comprises the smart hydrogel coated onto an exposed section of the fibre optic core, connected to an interrogation system measuring the difference in the signal. The information obtained was analysed using purpose-designed software. The developed sensor platform showed that an increase in the target species caused an increase in the signal lost from the sensor system, allowing for detection of the target species. The system has potential applications in areas such as clinical point of care, water detection in fuels, and the detection of dissolved ions in the water industry.
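As a minimal illustration of the signal-loss principle described above: light stays guided only while it strikes the core/coating boundary beyond the critical angle given by Snell's law, so a swelling hydrogel coating whose RI rises toward that of the core progressively defeats total internal reflection. The RI values below are illustrative assumptions, not measured properties of the hydrogels in this thesis.

```python
import numpy as np

n_core = 1.457  # illustrative silica-core refractive index (assumed)

# As the hydrogel coating swells and its RI rises toward the core's,
# the critical angle for total internal reflection increases, fewer
# guided rays satisfy the TIR condition, and transmitted power drops.
for n_coating in (1.38, 1.41, 1.44):
    theta_c = np.degrees(np.arcsin(n_coating / n_core))
    print(f"coating RI {n_coating:.2f} -> critical angle {theta_c:.1f} deg")
```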
Abstract:
Supply Chain Risk Management (SCRM) has become a popular area of research and study in recent years, as highlighted by the number of peer-reviewed articles that have appeared in the academic literature. This, coupled with companies' realisation that SCRM strategies are required to mitigate the risks they face, makes for challenging research questions in the field of risk management. The challenge that companies face today is not only to identify the types of risks they face, but also to assess the indicators of those risks, allowing them to mitigate risk before any disruption to the supply chain occurs. The use of social network theory can aid in the identification of disruption risk. This thesis proposes the combination of social networks, behavioural risk indicators and information management to uniquely identify disruption risk. The propositions developed from the literature review and an exploratory case study in an aerospace OEM are:
- By improving information flows through the use of social networks, we can identify supply chain disruption risk.
- The management of information to identify supply chain disruption risk can be explored using push and pull concepts.
The propositions were further explored through four focus group sessions, two within the OEM and two within an academic setting. The literature review conducted by the researcher did not find any studies that have evaluated supply chain disruption risk management in terms of social network analysis or information management. The evaluation of SCRM using these methods is thought to be a unique way of understanding the issues in SCRM that practitioners face today in the aerospace industry.
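As one deliberately simplified illustration of how social network measures can flag structural disruption risk, the sketch below scores a hypothetical supplier network by betweenness centrality. The node names and topology are invented, and this is not the analysis performed in the thesis.

```python
import networkx as nx

# Hypothetical aerospace supply network; edges follow the flow of parts.
G = nx.DiGraph([
    ("Supplier_A", "Tier1_X"), ("Supplier_B", "Tier1_X"),
    ("Supplier_C", "Tier1_Y"), ("Tier1_X", "OEM"), ("Tier1_Y", "OEM"),
])

# Nodes that broker many flows are single points of failure: a simple
# structural proxy for where a disruption would propagate furthest.
for node, score in sorted(nx.betweenness_centrality(G).items(),
                          key=lambda kv: -kv[1]):
    print(f"{node}: {score:.2f}")
```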
Abstract:
In the last few years, significant advances have been made in understanding how a yeast cell responds to the stress of producing a recombinant protein, and how this information can be used to engineer improved host strains. The molecular biology of the expression vector, through the choice of promoter, tag and codon optimization of the target gene, is also a key determinant of a high-yielding protein production experiment. Recombinant Protein Production in Yeast: Methods and Protocols examines the process of preparation of expression vectors, transformation to generate high-yielding clones, optimization of experimental conditions to maximize yields, scale-up to bioreactor formats and disruption of yeast cells to enable the isolation of the recombinant protein prior to purification. Written in the highly successful Methods in Molecular Biology™ series format, chapters include introductions to their respective topics, lists of the necessary materials and reagents, step-by-step, readily reproducible laboratory protocols, and key tips on troubleshooting and avoiding known pitfalls.
Abstract:
Quantitative structure-activity relationship (QSAR) analysis is a cornerstone of modern informatics. Predictive computational models of peptide-major histocompatibility complex (MHC)-binding affinity based on QSAR technology have now become important components of modern computational immunovaccinology. Historically, such approaches have been built around semiqualitative classification methods, but these are now giving way to quantitative regression methods. We review three methods: a 2D-QSAR additive partial least squares (PLS) method, a 3D-QSAR comparative molecular similarity index analysis (CoMSIA) method, and an iterative self-consistent (ISC) PLS-based additive method. The first two can identify the sequence dependence of peptide-binding specificity for various class I MHC alleles from the reported binding affinities (IC50) of peptide sets; the third is a recently developed extension of the additive method for the affinity prediction of class II peptides. The QSAR methods presented here have established themselves as immunoinformatic techniques complementary to existing methodology, useful in the quantitative prediction of binding affinity: current methods for the in silico identification of T-cell epitopes (which form the basis of many vaccines, diagnostics, and reagents) rely on the accurate computational prediction of peptide-MHC affinity. We have reviewed various human and mouse class I and class II allele models. Studied alleles comprise HLA-A*0101, HLA-A*0201, HLA-A*0202, HLA-A*0203, HLA-A*0206, HLA-A*0301, HLA-A*1101, HLA-A*3101, HLA-A*6801, HLA-A*6802, HLA-B*3501, H2-K(k), H2-K(b), H2-D(b), HLA-DRB1*0101, HLA-DRB1*0401, HLA-DRB1*0701, I-A(b), I-A(d), I-A(k), I-A(S), I-E(d), and I-E(k). In this chapter we give a step-by-step guide to building these models and assessing their reliability; the resulting models represent an advance on existing methods. The peptides used in this study are available from the AntiJen database (http://www.jenner.ac.uk/AntiJen). The PLS method is available commercially in the SYBYL molecular modeling software package. The resulting models, which can be used for accurate T-cell epitope prediction, are made freely available online at http://www.jenner.ac.uk/MHCPred.
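To make the additive PLS approach concrete, the sketch below fits a PLS regression to a toy design matrix of binary position/residue indicators for 9-mer peptides. The data are random placeholders rather than AntiJen affinities, and the commercial SYBYL implementation mentioned above is not reproduced here.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)

# Additive-QSAR design: one binary indicator per (position, residue)
# pair for 9-mer peptides, i.e. 9 x 20 = 180 columns. Values are random
# placeholders, not real peptide encodings or binding measurements.
n_peptides, n_features = 120, 9 * 20
X = rng.integers(0, 2, size=(n_peptides, n_features)).astype(float)
true_contrib = rng.normal(size=n_features)  # invented per-residue contributions
y = X @ true_contrib + rng.normal(scale=0.1, size=n_peptides)  # stands in for -log10(IC50)

pls = PLSRegression(n_components=5).fit(X, y)
print(f"training R^2 = {pls.score(X, y):.3f}")
```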
Abstract:
This paper presents a causal explanation of formative variables that unpacks and clarifies the generally accepted idea that formative indicators are ‘causes’ of the focal formative variable. In doing this, we explore the recent paper by Diamantopoulos and Temme (AMS Review, 3(3), 160-171, 2013) and show that the latter misunderstand the stance of Lee, Cadogan, and Chamberlain (AMS Review, 3(1), 3-17, 2013; see also Cadogan, Lee, and Chamberlain, AMS Review, 3(1), 38-49, 2013). By drawing on the multiple ways that one can interpret the idea of causality within the MIMIC model, we then demonstrate how the continued defense of the MIMIC model as a tool to validate formative indicators and to identify formative variables in structural models is misguided. We also present unambiguous recommendations on how formative variables can be modelled in lieu of the formative MIMIC model.
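For reference, the standard MIMIC specification under discussion, in its usual Jöreskog-Goldberger form, combines a formative equation for the latent variable with reflective equations for its effect indicators:

```latex
\eta = \gamma_1 x_1 + \gamma_2 x_2 + \dots + \gamma_q x_q + \zeta,
\qquad
y_j = \lambda_j \eta + \varepsilon_j, \quad j = 1, \dots, p,
```

where the $x_i$ are the formative ("causal") indicators, the $y_j$ are reflective indicators, and $\zeta$ and $\varepsilon_j$ are disturbance terms. The paper's argument turns on what must be assumed for the $x_i \to \eta$ paths to be read causally.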
Abstract:
In this rejoinder, we respond to the three commentaries written by Diamantopoulos, Howell, and Rigdon (all this issue) on our paper The MIMIC Model and Formative Variables: Problems and Solutions (also this issue). We contrast the approach taken in that paper, where we focused on clarifying the assumptions required to reject the formative MIMIC model, with a discussion of the assumptions that would be necessary to accept the formative MIMIC model as a viable approach. Importantly, we clarify the implications of entity realism and show how it is entirely logical that some theoretical constructs can be considered to have real existence independent of their indicators, while some cannot. We show how the formative model only logically holds when considering these 'unreal' entities. In doing so, we provide important counter-arguments to many of the criticisms made in Diamantopoulos' commentary, and the distinction also helps clarify a number of issues in the commentaries of Howell and Rigdon (both of which broadly agree with our original paper). We draw together these various threads to provide a set of conceptual tools researchers can use when thinking about the entities in their theoretical models.
Abstract:
Purpose – The purpose of this paper is to analyze the way in which the knowledge competitiveness of regions is measured, and to introduce the World Knowledge Competitiveness Index (WKCI) benchmarking tool. Design/methodology/approach – The methodology consists of an econometric analysis of key indicators relating to the concept of knowledge competitiveness for 125 regions from across the globe, comprising 55 representatives from North America, 45 from Europe and 25 from Asia and Oceania. Findings – The key to winning the super-competitive race in the knowledge-based economy is investment in the future: research and development, and education and training. It is found that the majority of the high-performing regional economies in the USA have a knowledge-competitive edge over their counterparts in Europe and Asia. Research limitations/implications – To an extent, the research is limited by the availability of comparable indicators and metrics at the regional level that extend across the globe. Whilst comparative data are often accessible at the national level, regional data sources remain underdeveloped. Practical implications – The WKCI has become internationally recognized as an important instrument for economic development policymakers and regional investment promotion agents as they create and refine their strategies and targets. In particular, it has provided a benchmark that allows regions to compare their knowledge competitiveness with other regions from around the world, and not only within their own nation or continent. Originality/value – The WKCI is the first composite and relative measure of the knowledge competitiveness of the globe's best-performing regions.
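The WKCI's actual variable set and weighting scheme are not described here, but composite benchmarking indices of this kind are typically built by standardizing each indicator across regions and combining the results with weights. The sketch below shows that generic construction with invented numbers; it is not the WKCI methodology.

```python
import numpy as np

regions = ["Region_1", "Region_2", "Region_3"]
# Columns might be, e.g., R&D spend (% GDP), patents per million
# inhabitants, tertiary-educated share (%). All values are invented.
indicators = np.array([
    [2.1, 310.0, 41.0],
    [1.4, 190.0, 35.5],
    [3.0, 420.0, 47.2],
])

# Standardize each indicator across regions, then apply (equal) weights.
z = (indicators - indicators.mean(axis=0)) / indicators.std(axis=0)
weights = np.full(indicators.shape[1], 1.0 / indicators.shape[1])
for region, score in zip(regions, z @ weights):
    print(f"{region}: {score:+.2f}")
```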
Abstract:
Nascent entrepreneurship and new business ownership are successive stages in the entrepreneurial process. We illustrate how information from the largest internationally harmonized database on entrepreneurship, the Global Entrepreneurship Monitor (GEM) project, can be used to approximate the entrepreneurial process. We make a methodological contribution by computing the ratio of new business ownership to nascent entrepreneurship in a way that reflects the transition from nascent to new business ownership, providing cross-nationally comparable information on the efficiency of the entrepreneurial process for 48 countries. We report evidence for the validity of the transition ratio by benchmarking it against transition rates obtained from longitudinal studies and by correlating it with commonly used entrepreneurship indicators and macro-level economic indices. The transition ratio enables future cross-national research on the entrepreneurial process by providing a reliable and valid indicator for one key transition in this process.
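The transition ratio itself is simple arithmetic: a country's new business ownership rate divided by its nascent entrepreneurship rate. A minimal sketch, with invented rates rather than actual GEM figures:

```python
# (nascent entrepreneurship rate %, new business ownership rate %);
# the country names and rates below are placeholders, not GEM data.
gem_rates = {
    "Country_A": (5.2, 3.9),
    "Country_B": (8.1, 4.0),
    "Country_C": (3.3, 2.9),
}

# A higher ratio suggests nascent efforts convert into owned
# businesses more efficiently in that country.
for country, (nascent, new_business) in gem_rates.items():
    print(f"{country}: transition ratio = {new_business / nascent:.2f}")
```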
Abstract:
Purpose: The purpose of this paper is to present the application of logical framework analysis (LFA) for implementing continuous quality improvement (CQI) across multiple settings in a tertiary care hospital. Design/methodology/approach: This study adopts a multiple case study approach. LFA is implemented within three diverse settings, namely an intensive care unit, a surgical ward, and an acute in-patient psychiatric ward. First, problem trees are developed in order to determine the root causes of quality issues specific to the three settings. Second, objective trees are formed suggesting solutions to the quality issues. Third, a project plan template using the logical framework (LOGFRAME) is created for each setting. Findings: This study shows substantial improvement in quality across the three settings. LFA proved effective for analysing quality issues and suggesting improvement measures objectively. Research limitations/implications: This paper applies LFA in specific, albeit diverse, settings in one hospital. For validation purposes, it would be ideal to repeat the analysis in other settings within the same hospital, as well as in several hospitals. The study also adopts a bottom-up approach, which could be triangulated with other sources of data. Practical implications: LFA enables top management to obtain an integrated view of performance. It also provides a basis for further quantitative research on quality management through the identification of key performance indicators, and facilitates the development of a business case for improvement. Originality/value: LFA is a novel approach for the implementation of CQI programs. Although LFA has been used extensively in project development to source funds from development banks, its application to quality improvement within healthcare projects is scant.
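For readers unfamiliar with the tool, a logical framework (LOGFRAME) is essentially a matrix linking a hierarchy of objectives to indicators, means of verification, and assumptions. The skeleton below is a generic illustration of that structure, not one of the three hospital project plans from the paper.

```python
# Generic LOGFRAME skeleton; rows run Goal -> Purpose -> Outputs.
# All narrative text, indicators, and assumptions are invented examples.
logframe = {
    "Goal": {
        "narrative": "Improved quality of care in the unit",
        "indicators": ["infection rate", "readmission rate"],
        "verification": ["clinical audit reports"],
        "assumptions": ["trust-level policy remains stable"],
    },
    "Purpose": {
        "narrative": "Root causes identified in the problem tree are addressed",
        "indicators": ["% of objective-tree actions completed"],
        "verification": ["project review minutes"],
        "assumptions": ["staff engagement is sustained"],
    },
    "Outputs": {
        "narrative": "Revised protocols in use on the ward",
        "indicators": ["protocol compliance rate"],
        "verification": ["spot audits"],
        "assumptions": ["training time is protected"],
    },
}

for level, row in logframe.items():
    print(f"{level}: {row['narrative']}")
```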