914 results for Problems faced in the classical approach
Abstract:
The ethnographic museum in the West has a long and troubling history. The display of 'exotic peoples' in travelling exhibitions began as early as the sixteenth century, but it was the mid and late nineteenth century that saw the great expansion of museums as sites to show artefacts collected - under anything but reputable circumstances - from what were considered the 'primitive', 'natural', or 'tribal' peoples of the world. Today the ethnographic museum is still a feature of large European cities, though faced with newly formulated dilemmas in the postcolonial world. For how can the material culture of a non-western people be collected and displayed in the West without its makers being translated into wordless and powerless objects of visual consumption? In national museums the processes of choosing, contextualizing and commentating exhibits help form national identity; in the ethnographic museum, similarly, they shape perceptions of the apparently distant Other. Like written ethnography, the museum is a 'translation of culture', with many of the associated problems traced by Talal Asad (1986). Like the written form, it has to represent the dialogic realities of cultural encounters in a fixed and intelligible form, to propose categories that define and order the material it has gathered. As the public face of academic ethnography, the museum interprets other cultures for the benefit of the general reader, and in that task museum practice, like all ethnography, operates within very specific historical and political parameters. How are museums in western Europe responding to the issues raised by critical ethnographers like James Clifford (1988), with their focus on the politics of representation? Is globalisation increasing the degree of accountability imposed on the ethnographic museum, or merely reinforcing older patterns? 
What opportunities and problems are raised by the use of more words - more 'translation' in the narrower sense - in ethnographic museums, and how do museums gain from introducing a reflexive and contextualizing concept of 'thick translation' (Appiah 1993) into their work of interpretation?
Abstract:
Jackson (2005) developed a hybrid model of personality and learning, known as the Learning Styles Profiler (LSP), which was designed to span the biological, socio-cognitive, and experiential research foci of personality and learning research. The hybrid model argues that functional and dysfunctional learning outcomes can best be understood in terms of how cognitions and experiences control, discipline, and re-express the biologically based scale of sensation-seeking. In two studies with part-time workers undertaking tertiary education (N = 137 and N = 58), established models of approach and avoidance from each of the three research foci were compared with Jackson's hybrid model in their prediction of leadership, work, and university outcomes using self-report and supervisor ratings. Results showed that the hybrid model was generally optimal and, as hypothesized, that goal orientation mediated the effect of sensation-seeking on outcomes (work performance, university performance, leader behaviours, and counterproductive work behaviour). Our studies suggest that the hybrid model has considerable promise as a predictor of work and educational outcomes, as well as of dysfunctional outcomes.
Abstract:
A content analysis examined the way majorities and minorities are represented in the British press. An analysis of the headlines of five British newspapers over a period of five years revealed that the words ‘majority’ and ‘minority’ appeared 658 times. Majority headlines were more frequent (66%), more likely to emphasize the numerical size of the majority, to link majority status with political groups, to be described with positive evaluations, and to cover political issues. By contrast, minority headlines were less frequent (34%), more likely to link minority status with ethnic groups and other social issues, and less likely to be described with positive evaluations. The implications of how real-life majorities and minorities are represented for our understanding of experimental research are discussed.
Abstract:
Despite recent success, many fast-disintegrating tablets (FDTs) still face problems of low mechanical strength, poor mouth-feel and long disintegration times. This study aimed to optimise FDTs using a progressive three-stage approach. A series of hardness, fracturability and disintegration time tests were performed on the formulations at each stage. During Stage I, tablets were prepared at concentrations between 2% and 5% w/w, formulated at each concentration with single and combined bloom strength gelatins (BSGs) of 75 and 225 bloom. Analysis revealed that both hardness and disintegration time increased with gelatin concentration. A combination (5% gelatin) FDT comprising a 50:50 ratio of 75:225 BSGs (hardness: 13.7 +/- 0.9 N; disintegration time: 24.1 +/- 0.6 s) was judged the most suitable and was carried forward to Stage II: the addition of the saccharides sorbitol, mannitol and sucrose in concentrations between 10% and 80% w/w. The best properties were exhibited by mannitol-containing formulations (50% - hardness: 30.9 +/- 2.8 N; disintegration time: 13.3 +/- 2.1 s), which were carried forward to Stage III: the addition of viscosity-modifying polymers to improve mouth-feel and aid pre-gastric retention. Addition of Carbopol 974P-NF enhanced viscosity at the cost of tablet hardness, whereas Pluronic F127 (6%) increased disintegration time and viscosity while retaining mechanical properties. (C) 2008 Elsevier B.V. All rights reserved.
Abstract:
The further development of the use of NMR relaxation times in chemical, biological and medical research has perhaps been curtailed by the length of time these measurements often take. The DESPOT (Driven Equilibrium Single Pulse Observation of T1) method has been developed, which reduces the time required to make a T1 measurement by a factor of up to 100. The technique has been studied extensively herein, and the thesis contains recommendations for its successful experimental application. Modified DESPOT-type equations for use when T2 relaxation is incomplete, or where off-resonance effects are thought to be significant, are also presented. A recently reported application of the DESPOT technique to MR imaging gave good initial results but suffered from the fact that the images were derived from spin systems that were not driven to equilibrium. An approach which allows equilibrium to be obtained with only one non-acquisition sequence is presented herein and should prove invaluable in variable-contrast imaging. A DESPOT-type approach has also been successfully applied to the measurement of T1ρ, which can be measured using this approach significantly faster than by the classical method. The new method also provides a value for T1 simultaneously, and the technique should therefore prove valuable in intermediate-energy-barrier chemical exchange studies. It also raises the possibility of obtaining simultaneous T1 and T1ρ MR images. The DESPOT technique depends on rapid multipulsing at nutation angles normally less than 90°. Work in this area has highlighted the possible time saving for spectral acquisition over the classical (90°-5T1)n technique. A new method based on these principles has been developed which permits the rapid multipulsing of samples to give T1 and M0 ratio information; the time needed, however, is only slightly longer than would be required to determine the M0 ratio alone using the classical technique. In 1H-decoupled 13C spectroscopy the method also gives nOe ratio information for the individual absorptions in the spectrum.
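The core DESPOT idea, as presented in the MR literature, is that the steady-state signal acquired at two or more flip angles can be linearised so that a straight-line fit yields T1 directly. A minimal Python sketch under that standard formulation (the flip angles, TR and tissue values below are illustrative, not taken from the thesis):

```python
import numpy as np

def spgr_signal(m0, t1, tr, alpha):
    """Steady-state spoiled gradient-echo signal at flip angle alpha (radians)."""
    e1 = np.exp(-tr / t1)
    return m0 * np.sin(alpha) * (1 - e1) / (1 - e1 * np.cos(alpha))

def despot1_fit(signals, alphas, tr):
    """Estimate T1 and M0 from signals at two or more flip angles.

    Uses the DESPOT linearisation
        S/sin(a) = E1 * S/tan(a) + M0 * (1 - E1),  E1 = exp(-TR/T1),
    so the slope of a straight-line fit gives E1 and hence T1.
    """
    y = signals / np.sin(alphas)
    x = signals / np.tan(alphas)
    slope, intercept = np.polyfit(x, y, 1)
    t1 = -tr / np.log(slope)
    m0 = intercept / (1 - slope)
    return t1, m0

# Simulated example: T1 = 800 ms, TR = 5 ms, flip angles 4 and 23 degrees
tr, t1_true, m0_true = 5.0, 800.0, 1000.0
alphas = np.deg2rad([4.0, 23.0])
signals = spgr_signal(m0_true, t1_true, tr, alphas)
t1_est, m0_est = despot1_fit(signals, alphas, tr)
```

With noiseless simulated signals the two-point fit recovers T1 and M0 exactly; in practice, noise propagation makes the choice of flip angles important.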
Abstract:
This paper reviews the approach to multidisciplinary and placement education in UK schools of pharmacy. The methodology involved triangulation of course documentation, staff interviews and a final-year student survey. Staff members were supportive of multidisciplinary learning. The advantages were the development of a wider appreciation of the students' future professional role and a better understanding of the roles of other professional groups. The barriers were logistics (student numbers; multiple sites; different timetables), the achievement of balanced numbers between disciplines and the engagement of students from all participating disciplines. Placement education was offered by all schools, predominantly in hospital settings. Key problems were funding and the lack of staff resources. Currently, multidisciplinary learning within the UK for pharmacy students is inadequate and is coupled with relatively low levels of placement education. For this to change, there should be a review of funding and support from government and private-sector employers.
Abstract:
This thesis describes a project which has investigated the evaluation of information systems. The work took place in, and is related to, a specific organisational context, that of the National Health Service (NHS). It aims to increase understanding of the evaluation which takes place in the service and the way in which this is affected by the NHS environment. It also investigates the issues which surround some important types of evaluation and their use in this context. The first stage of the project was a postal survey in which respondents were asked to describe the evaluation which took place in their authorities and to give their opinions about it. This was used to give an overview of the practice of IS evaluation in the NHS and to identify its uses and the problems experienced. Three important types of evaluation were then examined in more detail by means of action research studies. One of these dealt with the selection and purchase of a large hospital information system. The study took the form of an evaluation of the procurement process, and examined the methods used and the influence of organisational factors. The other studies are concerned with post-implementation evaluation, and examine the choice of an evaluation approach as well as its application. One was an evaluation of a community health system which had been operational for some time but was of doubtful value, and suffered from a number of problems. The situation was explored by means of a study of the costs and benefits of the system. The remaining study was the initial review of a system which was used in the administration of a Breast Screening Service. The service itself was also newly operational and the relationship between the service and the system was of interest.
Abstract:
This thesis is concerned with the use of the synoptic approach within decision making concerning nuclear waste management. The synoptic approach to decision making refers to an approach to rational decision making that assumes as an ideal, comprehensiveness of information and analysis. Two case studies are examined in which a high degree of synoptic analysis has been used within the decision making process. The case studies examined are the Windscale Inquiry into the decision to build the THORP reprocessing plant and the Nirex safety assessment of nuclear waste disposal. The case studies are used to test Lindblom's hypothesis that a synoptic approach to decision making is not achievable. In the first case study Lindblom's hypothesis is tested through the evaluation of the decision to build the THORP plant, taken following the Windscale Inquiry. It is concluded that the incongruity of this decision supports Lindblom's hypothesis. However, it has been argued that the Inquiry should be seen as a legitimisation exercise for a decision that was effectively predetermined, rather than a rigorous synoptic analysis. Therefore, the Windscale Inquiry does not provide a robust test of the synoptic method. It was concluded that a methodology was required, that allowed robust conclusions to be drawn, despite the ambiguity of the role of the synoptic method in decision making. Thus, the methodology adopted for the second case study was modified. In this case study the synoptic method was evaluated directly. This was achieved through the analysis of the cogency of the Nirex safety assessment. It was concluded that the failure of Nirex to provide a cogent synoptic analysis supported Lindblom's criticism of the synoptic method. Moreover, it was found that the synoptic method failed in the way that Lindblom predicted that it would.
Abstract:
Healthcare providers and policy makers are faced with an ever-increasing number of medical publications. Searching for relevant information and keeping up to date with new research findings remains a constant challenge. It has been widely acknowledged that narrative reviews of the literature are susceptible to several types of bias, and a systematic approach may protect against these biases. The aim of this thesis was to apply quantitative methods in the assessment of outcomes of topical therapies for psoriasis and, in particular, to systematically examine the comparative efficacy, tolerability and cost-effectiveness of topical calcipotriol in the treatment of mild-to-moderate psoriasis. Over the years, a wide range of techniques have been used to evaluate the severity of psoriasis and the outcomes of treatment. This lack of standardisation complicates the direct comparison of results and ultimately the pooling of outcomes from different clinical trials. There is a clear requirement for more comprehensive tools for measuring drug efficacy and disease severity in psoriasis. Ideally, the outcome measures need to be simple, relevant, practical and widely applicable, and the instruments should be reliable, valid and responsive. The results of the meta-analysis reported herein show that calcipotriol is an effective antipsoriatic agent. In the short term, the pooled data found calcipotriol to be more effective than calcitriol, tacalcitol, coal tar and short-contact dithranol. Only potent corticosteroids appeared to have comparable efficacy, with fewer short-term side-effects. Potent corticosteroids also added to the antipsoriatic effect of calcipotriol and appeared to suppress the occurrence of calcipotriol-induced irritation. There was insufficient evidence to support any large improvements in efficacy when calcipotriol is used in combination with systemic therapies in patients with severe psoriasis.
However, there was a total absence of long-term morbidity data on the effectiveness of any of the interventions studied. Decision analysis showed that, from the perspective of the NHS as payer, the relatively small differences in efficacy between calcipotriol and short-contact dithranol lead to large differences in the direct cost of treating patients with mild-to-moderate plaque psoriasis. Further research is needed to examine the clinical and economic issues affecting patients under treatment for psoriasis in the UK. In particular, the maintenance value and cost/benefit ratio of the various treatment strategies, and the assessment of patients' preferences, have not yet been adequately addressed for this chronic recurring disease.
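The thesis's exact pooling model is not detailed in this abstract. As a generic illustration only, the standard inverse-variance fixed-effect method combines per-trial effect estimates by weighting each with the reciprocal of its squared standard error (the numbers below are invented):

```python
import math

def fixed_effect_pool(estimates, std_errors):
    """Inverse-variance fixed-effect pooling of trial effect estimates.

    Each trial is weighted by 1/SE^2; the pooled standard error is
    the square root of the reciprocal of the summed weights.
    """
    weights = [1.0 / se**2 for se in std_errors]
    pooled = sum(w * e for w, e in zip(weights, estimates)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    return pooled, pooled_se

# Hypothetical example: two trials with effect estimates 0.2 and 0.4,
# each with standard error 0.1
est, se = fixed_effect_pool([0.2, 0.4], [0.1, 0.1])
```

With equal standard errors the pooled estimate reduces to the simple mean (0.3 here), and the pooled standard error shrinks as trials are added; a random-effects model would additionally account for between-trial heterogeneity.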
Abstract:
Case studies in copper-alloy rolling mill companies showed that existing planning systems suffer from numerous shortcomings. Where computerised systems are in use, these tend to simply emulate older manual systems and still rely heavily on modification by experienced planners on the shopfloor. As the size and number of orders increase, the task of process planners, while seeking to optimise the manufacturing objectives and keep within the production constraints, becomes extremely complicated because of the number of options for mixing or splitting the orders into batches. This thesis develops a modular approach to computerisation of the production management and planning functions. The full functional specification of each module is discussed, together with practical problems associated with their phased implementation. By adapting the Distributed Bill of Material concept from Material Requirements Planning (MRP) philosophy, the production routes generated by the planning system are broken down to identify the rolling stages required. Then to optimise the use of material at each rolling stage, the system generates an optimal cutting pattern using a new algorithm that produces practical solutions to the cutting stock problem. It is shown that the proposed system can be accommodated on a micro-computer, which brings it into the reach of typical companies in the copper-alloy rolling industry, where profit margins are traditionally low and the cost of widespread use of mainframe computers would be prohibitive.
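The thesis's new cutting-pattern algorithm is not reproduced in this abstract. As a hypothetical illustration of the underlying one-dimensional cutting stock problem it addresses, a classic first-fit-decreasing heuristic packs ordered strip widths into as few full-width coils as possible (coil width and order widths below are invented):

```python
def first_fit_decreasing(order_widths, coil_width):
    """Greedy first-fit-decreasing heuristic for one-dimensional cutting stock.

    Sorts the ordered strip widths in descending order, then places each
    strip into the first coil pattern with enough remaining width,
    opening a new coil when none fits.
    """
    patterns = []  # each pattern: list of strip widths cut from one coil
    for width in sorted(order_widths, reverse=True):
        for pattern in patterns:
            if sum(pattern) + width <= coil_width:
                pattern.append(width)
                break
        else:
            patterns.append([width])
    return patterns

# Example: slit 600 mm coils into ordered strip widths (mm)
orders = [350, 250, 200, 200, 150, 100]
plan = first_fit_decreasing(orders, 600)
```

Heuristics like this give quick, practical plans on modest hardware, which matches the abstract's micro-computer constraint, though exact approaches (e.g. column generation) can reduce trim loss further.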
Abstract:
A history of government drug regulation and the relationship between the pharmaceutical companies in the U.K. and the licensing authority is outlined. Phases of regulatory stringency are identified, with the formation of the Committees on Safety of Drugs and Medicines viewed as watersheds. A study of the impact of government regulation on industrial R&D activities focuses on the effects on the rate and direction of new product innovation. A literature review examines the decline in new chemical entity innovation. Regulations are cited as a major but not singular cause of the decline. Previous research attempting to determine the causes of such a decline on an empirical basis is reviewed, and the methodological problems associated with such research are identified. The U.K.-owned sector of the British pharmaceutical industry is selected for a study employing a bottom-up approach allowing disaggregation of data. A historical background to the industry is provided, with each company analysed on a case-study basis. Variations between companies regarding the policies adopted for R&D are emphasised. The process of drug innovation is described in order to determine possible indicators of the rate and direction of inventive and innovative activity. All possible indicators are considered and their suitability assessed. R&D expenditure data for the period 1960-1983 are subsequently presented as an input indicator. Intermediate output indicators are treated in a similar way, and patent data are identified as a readily available and useful source. The advantages and disadvantages of using such data are considered. Using interview material, patenting policies for most of the U.K. companies are described, providing a background for a patent-based study. Sources of patent data are examined with an emphasis on computerised systems. A number of searches using a variety of sources are presented. Patent family size is examined as a possible indicator of an invention's relative importance.
The patenting activity of the companies over the period 1960-1983 is given, and the variation between companies is noted. The relationship between patent data and the other indicators used is analysed using statistical methods, resulting in an apparent lack of correlation. An alternative approach, taking into account variations in company policy and phases in research activity, indicates a stronger relationship between patenting activity, R&D expenditure and NCE output over the period. The relationship is not apparent at an aggregated company level. Some evidence is presented for a relationship between phases of regulatory stringency and inventive and innovative activity, but the importance of other factors is emphasised.
Abstract:
Due to copyright restrictions, only available for consultation at Aston University Library and Information Services with prior arrangement.