27 results for Semi-Empirical Methods
Abstract:
Aluminium (Al) is known to be neurotoxic and has been associated with the aetiology of Alzheimer's disease. To date, only desferrioxamine (DFO), a trihydroxamic acid siderophore, has been used clinically for the removal of Al from the body. However, this drug is expensive, orally inactive and associated with many side effects. These studies employed a theoretical approach, using quantum mechanics (QM) via semi-empirical molecular orbital (MO) calculations, and a practical approach using U87-MG glioblastoma cells as a model for evaluating the influence of potential chelators on the passage of aluminium into cells. Preliminary studies involving the Cambridge Structural Database (CSD) identified that Al prefers binding to bidentate ligands in a 3:1 manner, with oxygen as the exclusive donor atom. Statistically significant differences in M-O bond lengths, compared with those of other trivalent metal ions such as Fe3+, were established and used as an acceptance criterion for subsequent MO calculations. Of the semi-empirical methods parameterised for Al, the PM3 Hamiltonian was found to give the most reliable final optimised geometries of simple 3:1 Al complexes. Consequently, the PM3 Hamiltonian was used for evaluating the heats of formation (Hf) of 3:1 complexes with more complicated ligands. No correlation exists between published stability constants and individual parameters calculated via PM3 optimisations, although investigation of the dicarboxylates reveals a correlation of 0.961, showing promise for affinity prediction of closely related ligands. A simple and inexpensive morin spectrofluorescence assay has been developed and optimised, producing results comparable to atomic absorption spectroscopy methods for the quantitative analysis of Al. This assay was used in subsequent in vitro models, initially on E. coli, where it indicated that Al inhibits the antimicrobial action of ciprofloxacin, a potent quinolone antibiotic.
Ensuing studies using the second model, U87-MG cells, investigated the influence of chelators on the transmembrane transport of Al, identifying 1,2-diethylhydroxypyridin-4-one as the ligand showing the greatest potential for chelating Al in a clinical setting. In conclusion, these studies have explored semi-empirical MO Hamiltonians and an in vitro U87-MG cell line as possible methods for predicting effective chelators of Al.
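The affinity-prediction step described above rests on a simple Pearson correlation between published stability constants (log K) and PM3-calculated heats of formation across a ligand series. The sketch below shows only the calculation; the numbers are invented placeholders, not data from the thesis.

```python
# Pearson correlation between PM3 heats of formation and published
# stability constants, as used to screen closely related ligands.
# NOTE: all numbers below are illustrative placeholders.
from math import sqrt

def pearson(xs, ys):
    """Pearson product-moment correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical dicarboxylate series: PM3 heat of formation (kcal/mol)
# of the 3:1 Al complex versus published log K.
hf = [-310.2, -295.7, -288.4, -276.9]
log_k = [13.1, 12.2, 11.8, 10.9]

r = pearson(hf, log_k)
print(f"r = {r:.3f}")
```

A strong correlation within one closely related family, as reported for the dicarboxylates, supports using the cheap calculated quantity as a proxy for the measured affinity within that family only.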
Abstract:
Purpose – The purpose of the paper was to conduct an empirical investigation to explore the impact of project management maturity models (PMMMs) on improving project performance. Design/methodology/approach – The investigation used a cross-case analysis involving over 90 individuals in seven organisations. Findings – The findings of the empirical investigation indicate that PMMMs demonstrate very high levels of variability in individuals' assessments of project management maturity. Furthermore, at higher levels of maturity, the type of performance improvement adopted following their application is related to the type of PMMM used in the assessment. The paradox of the unreliability of PMMMs and their widespread acceptance is resolved by calling upon the “wisdom of crowds” phenomenon, which has implications for the use of maturity model assessments in other arenas. Research limitations/implications – The investigation does have the usual issues associated with case research, but the steps taken in the cross-case construction and analysis have improved the overall robustness and extendibility of the findings. Practical implications – The tendency for PMMMs to shape improvements based on their own inherent structure needs further understanding. Originality/value – The use of empirical methods to investigate the link between project maturity models and extant changes in project management performance is highly novel, and the findings that result from this have added resonance.
Abstract:
Over the past decade, several experienced Operational Researchers have advanced the view that the theoretical aspects of model building have raced ahead of the ability of people to use them. Consequently, the impact of Operational Research on commercial organisations and the public sector is limited, and many systems fail to achieve their anticipated benefits in full. The primary objective of this study is to examine a complex interactive Stock Control system, and identify the reasons for the differences between the theoretical expectations and the operational performance. The methodology used is to hypothesise all the possible factors which could cause a divergence between theory and practice, and to evaluate numerically the effect each of these factors has on two main control indices - Service Level and Average Stock Value. Both analytical and empirical methods are used, and simulation is employed extensively. The factors are divided into two main categories for analysis - theoretical imperfections in the model, and the usage of the system by Buyers. No evidence could be found in the literature of any previous attempts to place the differences between theory and practice in a system in quantitative perspective nor, more specifically, to study the effects of Buyer/computer interaction in a Stock Control system. The study reveals that, in general, the human factors influencing performance are of a much higher order of magnitude than the theoretical factors, thus providing objective evidence to support the original premise. The most important finding is that, by judicious intervention into an automatic stock control algorithm, it is possible for Buyers to produce results which not only attain but surpass the algorithmic predictions. However, the complexity and behavioural recalcitrance of these systems are such that an innately numerate, enquiring type of Buyer needs to be inducted to realise the performance potential of the overall man/computer system.
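The two control indices around which the study's evaluation revolves can be illustrated with a minimal reorder-point simulation. Everything below (demand distribution, policy parameters, unit cost) is invented for the sketch and bears no relation to the system actually studied; it only shows how Service Level and Average Stock Value fall out of such a simulation.

```python
import random

def simulate(reorder_point, order_qty, unit_cost, days, lead_time, seed=1):
    """Crude daily stock-control simulation returning the two control
    indices discussed above: Service Level (fraction of demand met
    from stock) and Average Stock Value."""
    rng = random.Random(seed)
    stock = order_qty
    on_order = []          # list of (arrival_day, qty)
    met = demanded = 0
    stock_value_sum = 0.0
    for day in range(days):
        # receive any replenishment orders due today
        stock += sum(q for d, q in on_order if d == day)
        on_order = [(d, q) for d, q in on_order if d != day]
        # illustrative daily demand draw
        demand = rng.randint(0, 8)
        demanded += demand
        sold = min(stock, demand)
        met += sold
        stock -= sold
        # reorder when the stock position falls to the reorder point
        position = stock + sum(q for _, q in on_order)
        if position <= reorder_point:
            on_order.append((day + lead_time, order_qty))
        stock_value_sum += stock * unit_cost
    service_level = met / demanded if demanded else 1.0
    avg_stock_value = stock_value_sum / days
    return service_level, avg_stock_value

sl, asv = simulate(reorder_point=30, order_qty=60, unit_cost=2.5,
                   days=365, lead_time=5)
print(f"Service Level: {sl:.1%}, Average Stock Value: {asv:.2f}")
```

The study's point is precisely that a Buyer's judicious intervention in such an algorithm can move these two indices further than refinements to the model itself.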
Abstract:
This thesis comprises two main objectives. The first involved stereochemical studies of chiral 4,6-diamino-1-aryl-1,2-dihydro-s-triazines and an investigation of how the different conformations of these stereoisomers may affect their binding affinity to the enzyme dihydrofolate reductase (DHFR). The ortho-substituted 1-aryl-1,2-dihydro-s-triazines were synthesised by the three-component method. An ortho-substitution at the C6' position was observed when meta-azidocycloguanil was decomposed in acid. The ortho-substituent restricts free rotation, and this gives rise to atropisomerism. Ortho-substituted 4,6-diamino-1-aryl-2-ethyl-1,2-dihydro-2-methyl-s-triazine contains two elements of chirality and therefore exists as four stereoisomers: (S,aR), (R,aS), (R,aR) and (S,aS). The energy barriers to rotation of these compounds were calculated with the semi-empirical molecular orbital program MOPAC and were found to be in excess of 23 kcal/mol. The diastereoisomers were resolved and enriched by C18 reversed-phase HPLC. Nuclear Overhauser effect experiments revealed that (S,aR) and (R,aS) were the more stable pair of stereoisomers and therefore existed as the major component. The minor diastereoisomers showed greater binding affinity for rat liver DHFR in an in vitro assay. The second objective entailed investigating the possibility of retaining DHFR inhibitory activity by replacing the classical diamino heterocyclic moiety with an amidinyl group. 4-Benzylamino-3-nitro-N,N-dimethyl-phenylamidine was synthesised in two steps. One of the two phenylamidines showed weak inhibition of rat liver DHFR. This weak activity may be due to the failure of the inhibitor molecule to form strong hydrogen bonds with residue Glu-30 at the active site of the enzyme.
Abstract:
This research explores how news media reports construct representations of a business crisis through language. In an innovative approach to dealing with the vast pool of potentially relevant texts, media texts concerning the BP Deepwater Horizon oil spill are gathered from three different time points: immediately after the explosion in 2010, one year later in 2011 and again in 2012. The three sets of 'BP texts' are investigated using discourse analysis and semi-quantitative methods within a semiotic framework that gives an account of language at the semiotic levels of sign, code, mythical meaning and ideology. The research finds in the texts three discourses of representation concerning the crisis that show a movement from the ostensibly representational to the symbolic and conventional: a discourse of 'objective factuality', a discourse of 'positioning' and a discourse of 'redeployment'. This progression can be shown to have useful parallels with Peirce's sign classes of Icon, Index and Symbol, with their implied movement from a clear motivation by the Object (in this case the disaster events) to an arbitrary, socially agreed connection. However, the naturalisation of signs, whereby ideologies are encoded in ways of speaking and writing that present them as 'taken for granted', is at its most complete when it is least discernible. The findings suggest that media coverage is likely to move on from symbolic representation to a new kind of iconicity, through a fourth discourse of 'naturalisation'. Here the representation turns back towards ostensible factuality or iconicity, to become the 'naturalised icon'. This work adds to the study of media representation a heuristic for understanding how the meaning-making of a news story progresses. It offers a detailed account of what the stages of this progression 'look like' linguistically, and suggests scope for future research into both the language characteristics of phases and different news-reported phenomena.
Abstract:
Historically, recombinant membrane protein production has been a major challenge, meaning that far fewer membrane protein structures have been published than those of soluble proteins. However, there has been a recent, almost exponential increase in the number of membrane protein structures deposited in the Protein Data Bank. This suggests that empirical methods are now available that can ensure the required protein supply for these difficult targets. This review focuses on methods available for protein production in yeast, which is an important source of recombinant eukaryotic membrane proteins. We provide an overview of approaches to optimising the expression plasmid, host cell and culture conditions, as well as the extraction and purification of functional protein for crystallization trials in preparation for structural studies.
Abstract:
Storyline detection from news articles aims at summarizing events described under a certain news topic and revealing how those events evolve over time. It is a difficult task because it requires first detecting events from news articles published in different time periods and then constructing storylines by linking events into coherent news stories. Moreover, each storyline has a different hierarchical structure that is dependent across epochs. Existing approaches often ignore this dependency of hierarchical structures in storyline generation. In this paper, we propose an unsupervised Bayesian model, called the dynamic storyline detection model, to extract structured representations and evolution patterns of storylines. The proposed model is evaluated on a large-scale news corpus. Experimental results show that it outperforms several baseline approaches.
Abstract:
The use of Diagnosis Related Groups (DRG) as a mechanism for hospital financing is a currently debated topic in Portugal. The DRG system was scheduled to be initiated by the Health Ministry of Portugal on January 1, 1990 as an instrument for the allocation of public hospital budgets funded by the National Health Service (NHS), and as a method of payment for other third-party payers (e.g., Public Employees (ADSE), private insurers, etc.). Based on experience from other countries such as the United States, it was expected that implementation of this system would result in more efficient hospital resource utilisation and a more equitable distribution of hospital budgets. However, in order to minimise the potentially adverse financial impact on hospitals, the Portuguese Health Ministry decided to gradually phase in the use of the DRG system for budget allocation by using blended hospital-specific and national DRG casemix rates. Since implementation in 1990, the percentage of each hospital's budget based on hospital-specific costs was to decrease, while the percentage based on DRG casemix was to increase. This was scheduled to continue until 1995, when the plan called for allocating yearly budgets on a 50% national and 50% hospital-specific cost basis. While all other non-NHS third-party payers are currently paying based on DRGs, the adoption of DRG casemix as a National Health Service budget-setting tool has been slower than anticipated. There is now some argument in both the political and academic communities as to the appropriateness of DRGs as a budget-setting criterion as well as to their impact on hospital efficiency in Portugal. This paper uses a two-stage procedure to assess the impact of actual DRG payment on the productivity (through its components, i.e., technological change and technical efficiency change) of diagnostic technology in Portuguese hospitals during the years 1992–1994, using both parametric and nonparametric frontier models.
We find evidence that the DRG payment system does appear to have had a positive impact on productivity and technical efficiency of some commonly employed diagnostic technologies in Portugal during this time span.
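The decomposition of productivity into technological change and technical efficiency change referred to above is conventionally expressed through a Malmquist index. The sketch below shows only that arithmetic; the distance-function scores are invented for illustration, whereas in the study they would come from the estimated parametric or nonparametric frontier.

```python
from math import sqrt

def malmquist(d_t_t, d_t_t1, d_t1_t, d_t1_t1):
    """Malmquist productivity index decomposition.
    d_a_b = distance score of the period-b observation measured
    against the period-a frontier (1.0 = on that frontier).
    Returns (efficiency change, technological change, productivity change).
    """
    eff_change = d_t1_t1 / d_t_t
    tech_change = sqrt((d_t_t1 / d_t1_t1) * (d_t_t / d_t1_t))
    return eff_change, tech_change, eff_change * tech_change

# Hypothetical hospital: closer to the frontier in t+1 (efficiency
# gain) while the frontier itself has also shifted (technical change).
ec, tc, mi = malmquist(d_t_t=0.80, d_t_t1=1.05, d_t1_t=0.70, d_t1_t1=0.90)
print(f"EC={ec:.3f}  TC={tc:.3f}  Malmquist={mi:.3f}")
```

Values above 1 indicate improvement, so a finding of positive productivity impact corresponds to Malmquist indices exceeding unity over 1992–1994.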
Abstract:
Stereotypes of salespeople are common currency in US media outlets, and research suggests that these stereotypes are uniformly negative. However, there is no reason to expect that stereotypes will be consistent across cultures. The present paper provides the first empirical examination of salesperson stereotypes in an Asian country, specifically Taiwan. Using accepted psychological methods, Taiwanese salesperson stereotypes are found to be twofold: a negative stereotype quite congruent with existing US stereotypes, but also a positive stereotype, which may be related to the specific culture of Taiwan.
Abstract:
In developed countries, travel time savings can account for as much as 80% of the overall benefits arising from transport infrastructure and service improvements. In developing countries they are generally ignored in transport project appraisals, notwithstanding their importance. One reason for ignoring these benefits in developing countries is that there is insufficient empirical evidence to support the conventional models for valuing travel time where work patterns, particularly of the poor, are diverse and it is difficult to distinguish between work and non-work activities. The exclusion of time-saving benefits may lead to a bias against investment decisions that benefit the poor and understate the poverty-reduction potential of transport investments in Least Developed Countries (LDCs). This is because the poor undertake most travel and transport by walking and headloading on local roads, tracks and paths, and improvements to local infrastructure and services bring them large time-saving benefits through modal shifts. The paper reports on an empirical study to develop a methodology for valuing rural travel time savings in the LDCs. Apart from identifying the theoretical and empirical issues in valuing travel time savings in the LDCs, the paper presents and discusses the results of an analysis of data from Bangladesh. Some of the study findings challenge the conventional wisdom concerning time-saving values. The Bangladesh study suggests that the Western concept of dividing travel time savings into working and non-working time savings is broadly valid in the developing-country context. The study validates the use of preference methods in valuing non-working time savings. However, the stated preference (SP) method is more appropriate than the revealed preference (RP) method.
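In stated-preference work of the kind described, the value of travel time savings is conventionally recovered as the ratio of the time and cost coefficients of a discrete choice model. The coefficients below are invented for illustration; estimating them from actual SP survey data is the substantive task of such a study.

```python
def value_of_time(beta_time, beta_cost):
    """Implied value of travel time savings from a linear-in-parameters
    utility function U = ... + beta_time * minutes + beta_cost * cost.
    Both coefficients are normally negative; their ratio gives the
    willingness to pay per minute saved."""
    return beta_time / beta_cost

# Hypothetical SP estimates (utility per minute, per unit of currency):
beta_time = -0.045
beta_cost = -0.012
vot_per_minute = value_of_time(beta_time, beta_cost)
print(f"VOT = {vot_per_minute:.2f} per minute "
      f"({vot_per_minute * 60:.1f} per hour)")
```

Separate coefficient pairs for working and non-working trips would yield the separate working and non-working time values whose validity the Bangladesh study examines.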
Abstract:
Derivational morphology proposes meaningful connections between words and is largely unrepresented in lexical databases. This thesis presents a project to enrich a lexical database with morphological links and to evaluate their contribution to disambiguation. A lexical database with sense distinctions was required. WordNet was chosen because of its free availability and widespread use. Its suitability was assessed through critical evaluation with respect to specifications and criticisms, using a transparent, extensible model. The identification of serious shortcomings suggested a portable enrichment methodology, applicable to alternative resources. Although 40% of the most frequent words are prepositions, they have been largely ignored by computational linguists, so the addition of prepositions was also required. The preferred approach to morphological enrichment was to infer relations from phenomena discovered algorithmically. Both existing databases and existing algorithms can capture regular morphological relations but cannot capture exceptions correctly; neither provides any semantic information. Some morphological analysis algorithms are subject to the fallacy that morphological analysis can be performed simply by segmentation. Morphological rules, grounded in observation and etymology, govern associations between and attachment of suffixes, and contribute to defining the meaning of morphological relationships. Specifying character substitutions circumvents the segmentation fallacy. Morphological rules are prone to undergeneration, minimised through a variable lexical-validity requirement, and to overgeneration, minimised by rule reformulation and by restricting monosyllabic output. Rules take into account the morphology of ancestor languages through co-occurrences of morphological patterns. Multiple rules applicable to an input suffix need their precedence established.
The resistance of prefixations to segmentation has been addressed by identifying linking vowel exceptions and irregular prefixes. The automatic affix discovery algorithm applies heuristics to identify meaningful affixes and is combined with morphological rules into a hybrid model, fed only with empirical data, collected without supervision. Further algorithms apply the rules optimally to automatically pre-identified suffixes and break words into their component morphemes. To handle exceptions, stoplists were created in response to initial errors and fed back into the model through iterative development, leading to 100% precision, contestable only on lexicographic criteria. Stoplist length is minimised by special treatment of monosyllables and reformulation of rules. 96% of words and phrases are analysed. 218,802 directed derivational links have been encoded in the lexicon rather than the wordnet component of the model because the lexicon provides the optimal clustering of word senses. Both links and analyser are portable to an alternative lexicon. The evaluation uses the extended gloss overlaps disambiguation algorithm. The enriched model outperformed WordNet in terms of recall without loss of precision. Failure of all experiments to outperform disambiguation by frequency reflects on WordNet sense distinctions.
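The central device above — a rule specified as a character substitution rather than naive segmentation, validated against the lexicon — can be sketched in a few lines. The rules and toy lexicon below are illustrative inventions, not the thesis's actual rule set.

```python
# A morphological rule as a (suffix, replacement) character
# substitution. Naive segmentation of "happiness" yields the
# non-word "happi"; substituting "-iness" -> "-y" recovers "happy".
RULES = [
    ("iness", "y"),    # happiness  -> happy
    ("ness", ""),      # darkness   -> dark
    ("ation", "e"),    # derivation -> derive
]

LEXICON = {"happy", "dark", "derive"}   # toy stand-in for a real lexicon

def derive_base(word, rules, lexicon):
    """Apply the first rule (rules are in precedence order) whose
    output is lexically valid; lexical validation curbs
    overgeneration. Returns (base, suffix) or None."""
    for suffix, replacement in rules:
        if word.endswith(suffix):
            base = word[: -len(suffix)] + replacement
            if base in lexicon:
                return base, suffix
    return None

for w in ["happiness", "darkness", "derivation"]:
    print(w, "->", derive_base(w, RULES, LEXICON))
```

Note that rule order matters here, which is the precedence problem the thesis raises: "-iness" must be tried before the more general "-ness", and a word like "witness" is correctly rejected because "wit" substitution candidates fail lexical validation in this toy lexicon.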
Abstract:
Data Envelopment Analysis (DEA) is a nonparametric method for measuring the efficiency of a set of decision making units such as firms or public sector agencies, first introduced into the operational research and management science literature by Charnes, Cooper, and Rhodes (CCR) [Charnes, A., Cooper, W.W., Rhodes, E., 1978. Measuring the efficiency of decision making units. European Journal of Operational Research 2, 429–444]. The original DEA models were applicable only to technologies characterized by positive inputs/outputs. In subsequent literature there have been various approaches to enable DEA to deal with negative data. In this paper, we propose a semi-oriented radial measure, which permits the presence of variables which can take both negative and positive values. The model is applied to data on a notional effluent processing system to compare the results with those yielded by two alternative methods for dealing with negative data in DEA: the modified slacks-based model suggested by Sharp et al. [Sharp, J.A., Liu, W.B., Meng, W., 2006. A modified slacks-based measure model for data envelopment analysis with 'natural' negative outputs and inputs. Journal of the Operational Research Society 57 (11), 1–6] and the range directional model developed by Portela et al. [Portela, M.C.A.S., Thanassoulis, E., Simpson, G., 2004. A directional distance approach to deal with negative data in DEA: An application to bank branches. Journal of the Operational Research Society 55 (10), 1111–1121]. A further example explores the advantages of using the new model.
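The core device behind a semi-oriented treatment of sign-free data can be sketched simply: each variable that may take either sign is split into two non-negative components, which a radial DEA model can then expand or contract in the usual way. The sketch below shows only this data-preparation step, with invented figures, not the full optimisation model of the paper.

```python
def split_signed(values):
    """Split a sign-free variable v into (v_plus, v_minus) such that
    v = v_plus - v_minus and both components are non-negative — the
    substitution underlying semi-oriented radial measures."""
    plus = [max(v, 0.0) for v in values]
    minus = [max(-v, 0.0) for v in values]
    return plus, minus

# Hypothetical net-output column for five decision making units
# (e.g. profit, which may be negative):
net_output = [12.0, -3.5, 0.0, 7.25, -10.0]
p, m = split_signed(net_output)
print("v+ =", p)
print("v- =", m)
# Reconstruction check: v = v+ - v-
assert all(abs(v - (a - b)) < 1e-12 for v, a, b in zip(net_output, p, m))
```

The two components are then entered into the radial model with opposite orientations, so that improvement means growing the positive part while shrinking the negative part.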
Abstract:
This thesis examines the effect of rights issue announcements on the stock prices of companies listed on the Kuala Lumpur Stock Exchange (KLSE) between 1987 and 1996. The emphasis is on reporting whether the KLSE is semi-strong form efficient with respect to the announcement of rights issues and on checking whether the implications of corporate finance theories for the effect of an event can be supported in the context of an emerging market. Once the effect is established, potential determinants of abnormal returns identified by previous empirical work and corporate finance theory are analysed. By examining 70 companies making clean rights issue announcements, this thesis will hopefully shed light on some important issues in long-term corporate financing. Event study analysis is used to check the efficiency of the Malaysian stock market, while cross-sectional regression analysis is executed to identify possible explanatory variables for the rights issue announcements' effect. To ensure the results presented are not contaminated, econometric and statistical issues raised in both analyses have been taken into account. Given the small amount of empirical research conducted in this part of the world, the results of this study will hopefully be of use to investors, security analysts, corporate financial managers, regulators and policy makers, as well as those interested in capital-market-based research on an emerging market. It is found that the Malaysian stock market is not semi-strong form efficient, since there exists a persistent non-zero abnormal return. This finding is not consistent with the hypothesis that security returns adjust rapidly to reflect new information. It may be that the result is influenced by the sample, consisting mainly of below-average-size companies which tend to be thinly traded. Nevertheless, these issues have been addressed.
Another important issue which has emerged from the study is that there is some evidence to suggest that insider trading activity existed in this market. In addition to these findings, when the rights issue announcements' effect is compared to the implications of corporate finance theories in predicting the sign of abnormal returns, the signalling model, asymmetric information model, perfect substitution hypothesis and Scholes' information hypothesis cannot be supported.
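An event study of the kind applied here computes abnormal returns as the difference between realised returns and those predicted by a market model estimated over a pre-event window; cumulating them over the event window gives the CAR whose persistence is at issue. The return series below are invented toy numbers, shown only to make the mechanics concrete.

```python
def market_model(stock, market):
    """OLS fit of R_stock = alpha + beta * R_market over an
    estimation window."""
    n = len(stock)
    mean_s, mean_m = sum(stock) / n, sum(market) / n
    beta = (sum((m - mean_m) * (s - mean_s) for s, m in zip(stock, market))
            / sum((m - mean_m) ** 2 for m in market))
    alpha = mean_s - beta * mean_m
    return alpha, beta

def abnormal_returns(stock, market, alpha, beta):
    """AR_t = R_t - (alpha + beta * R_market_t)."""
    return [s - (alpha + beta * m) for s, m in zip(stock, market)]

# Toy data: an estimation window, then a short event window around
# a hypothetical rights issue announcement.
est_stock  = [0.010, -0.004, 0.006, 0.002, -0.001, 0.008]
est_market = [0.008, -0.005, 0.004, 0.001, -0.002, 0.007]
alpha, beta = market_model(est_stock, est_market)

event_stock  = [0.031, 0.012, -0.002]
event_market = [0.004, 0.006, 0.001]
ar = abnormal_returns(event_stock, event_market, alpha, beta)
car = sum(ar)
print(f"alpha={alpha:.5f} beta={beta:.3f} CAR={car:.4f}")
```

A CAR that remains significantly non-zero well after the announcement is the kind of evidence against semi-strong form efficiency that the thesis reports.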
Abstract:
This thesis examines the present provisions for pre-conception care and the views of the providers of services. Pre-conception care is seen by some clinicians and health educators as a means of making any necessary changes in lifestyle, correcting imbalances in the nutritional status of the prospective mother (and father) and assessing any medical problems, thus maximizing the likelihood of the normal development of the baby. Pre-conception care may be described as a service to bridge the gap between the family planning clinic and the first ante-natal booking appointment. There were three separate foci for the empirical research: the Foresight organisation (a charity which has pioneered pre-conception care in Britain); the pre-conception care clinic at the West London Hospital, Hammersmith; and the West Midlands Regional Health Authority. The six main sources of data were: twenty-five clinicians operating Foresight pre-conception clinics; couples attending pre-conception clinics; committee members of the Foresight organisation; staff of the West London Hospital pre-conception clinic, Hammersmith; District Health Education Officers working in the West Midlands Regional Health Authority; and members of the Ante-Natal Care Action Group, a sub-group of the Regional Health Advisory Group on Health Promotion and Preventive Medicine. A range of research methods was adopted: questionnaires and report forms used in co-operation with the Foresight clinicians; interviews; participant observation; discussions and informal meetings; and, finally, literature and official documentation. The research findings illustrated that pre-conception care services provided at the predominantly private Foresight clinics were of a rather 'ad hoc' nature. The type of provision varied considerably and clearly reflected the views held by its providers.
The protocol which had been developed to assist in the standardization of results was not followed by the clinicians. The pre-conception service provided at the West London Hospital shared some similarities in its approach with the Foresight provision; a major difference was that it did not advocate the use of routine hair trace metal analysis. Interviews with District Health Education Officers and with members of the Ante-Natal Care Action Group revealed a tentative and cautious approach to pre-conception care generally and to the Foresight approach in particular. The thesis concludes with a consideration of the future of pre-conception care and the prospects for the establishment of a comprehensive pre-conception care service.
Abstract:
Facilitated by an Engineer and a Social Scientist, both of whom have expertise in Engineering Education Research and Evaluation (EERE), this interactive workshop is divided into three main sections, each one focusing on a different area of evaluation. It will build on research conducted at Aston University School of Engineering and Applied Science to explore and critique the value of introducing CDIO across the first year undergraduate curriculum. Participants will be invited to consider the pedagogical and engineering related challenges of evaluating the academic and practical value of CDIO as a strategy for learning and teaching in the discipline. An empirical approach to evaluation developed by the researchers to provide empirically grounded evidence of the pedagogical and vocational value of CDIO will form the theoretical and conceptual basis of the workshop. This approach is distinctive in that it encapsulates both engineering and social science methods of evaluation. It is also contemporaneous in nature, with the researchers acting as a ‘fly on the wall’ capturing data as the programme unfolds. Through facilitated discussion and participation, the workshop will provide colleagues with the opportunity to develop a cross-disciplinary, empirically grounded research proposal specifically for the purposes of critically evaluating CDIO. It is anticipated that during the workshop, colleagues will work together in small groups. Suitable pedagogical approaches and tools will be suggested and a purposefully developed Engineering Education Research Guide, written by the workshop facilitators, will be given to all participants to inform and support the Workshop approach.