35 results for Teaching Evaluation and Assessment
Abstract:
Formulation of solid dispersions is one of the effective methods to increase the rate of solubilization and dissolution of poorly soluble drugs. Solid dispersions of chloramphenicol (CP) and sulphamethoxazole (SX) as model drugs were prepared by the melt fusion method using polyethylene glycol 8000 (PEG 8000) as an inert carrier. The dissolution rates of CP and SX were rapid from solid dispersions with low drug and high polymer content. Characterization was performed using Fourier transform infrared spectroscopy (FTIR), differential scanning calorimetry (DSC) and scanning electron microscopy (SEM). FTIR analysis of the solid dispersions of CP and SX showed that there was no interaction between PEG 8000 and the drugs. Hyper-DSC studies revealed that CP and SX were converted into an amorphous form when formulated as solid dispersions in PEG 8000. Mathematical analysis of the release kinetics demonstrated that drug release from the various formulations followed different mechanisms. Permeability studies demonstrated that both CP and SX, when formulated as solid dispersions, showed enhanced permeability across Caco-2 cells, and that CP can be classified as a well-absorbed compound when formulated as a solid dispersion. © 2013 Informa Healthcare USA, Inc.
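As an illustration of the kind of release-kinetics analysis mentioned above, the sketch below fits three commonly used empirical models (zero-order, Higuchi and Korsmeyer-Peppas) to a dissolution profile and compares their goodness of fit. The abstract does not state which models were applied, and the time points and percentage-released values here are hypothetical placeholders, not data from the study.

```python
# Hedged sketch: fitting common drug-release models to hypothetical dissolution data.
import numpy as np

# Hypothetical cumulative % drug released at sampling times (minutes).
t = np.array([5, 10, 15, 30, 45, 60], dtype=float)
q = np.array([18.0, 31.0, 40.0, 58.0, 70.0, 79.0])

def r_squared(y, y_hat):
    ss_res = np.sum((y - y_hat) ** 2)
    ss_tot = np.sum((y - np.mean(y)) ** 2)
    return 1.0 - ss_res / ss_tot

# Zero-order: Q = k0 * t  (least-squares slope through the origin)
k0 = np.sum(q * t) / np.sum(t * t)
r2_zero = r_squared(q, k0 * t)

# Higuchi: Q = kH * sqrt(t)
s = np.sqrt(t)
kH = np.sum(q * s) / np.sum(s * s)
r2_higuchi = r_squared(q, kH * s)

# Korsmeyer-Peppas: Q = k * t**n, fitted on the log-log scale
n, log_k = np.polyfit(np.log(t), np.log(q), 1)
r2_kp = r_squared(np.log(q), n * np.log(t) + log_k)

print(f"zero-order        k0={k0:.2f}            R^2={r2_zero:.3f}")
print(f"Higuchi           kH={kH:.2f}            R^2={r2_higuchi:.3f}")
print(f"Korsmeyer-Peppas  n={n:.2f} k={np.exp(log_k):.2f}  R^2={r2_kp:.3f}")
```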
Abstract:
The objective of this study has been to enable a greater understanding of the biomass gasification process through the development and use of process and economic models. A new theoretical equilibrium model of gasification is described using the operating condition called the adiabatic carbon boundary. This represents an ideal gasifier working at the point where the carbon in the feedstock is completely gasified. The model can be used as a 'target' against which the results of real gasifiers can be compared, but it does not simulate the results of real gasifiers. A second model has been developed which uses a stagewise approach in order to model fluid bed gasification, and its results have indicated that pyrolysis and the reactions of pyrolysis products play an important part in fluid bed gasifiers. Both models have been used in sensitivity analyses: the biomass moisture content and gasifying agent composition were found to have the largest effects on performance, whilst pressure and heat loss had lesser effects. Correlations have been produced to estimate the total installed capital cost of gasification systems and have been used in an economic model of gasification. This has been used in a sensitivity analysis to determine the factors which most affect the profitability of gasification. The most important influences on gasifier profitability have been found to be feedstock cost, product selling price and throughput. Given the economic conditions of late 1985, refuse gasification for the production of producer gas was found to be viable at throughputs of about 2.5 tonnes/h dry basis and above, in the metropolitan counties of the United Kingdom. At this throughput and above, the largest element of product gas cost is the feedstock cost, which is also the most variable cost element.
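The one-at-a-time sensitivity analysis described for the economic model can be sketched as follows. The cost equation, plant figures and perturbation range below are deliberately simplified assumptions for illustration only; they are not the correlations or 1985 cost data from the thesis.

```python
# Hedged sketch of a one-at-a-time sensitivity analysis on a toy product-gas cost model.
from dataclasses import dataclass

@dataclass
class Plant:
    throughput_tph: float      # dry feedstock throughput, tonnes/h (hypothetical)
    feed_cost: float           # feedstock cost per tonne (hypothetical)
    capital_charge: float      # annualised capital charge per year (hypothetical)
    operating_cost: float      # fixed operating cost per year (hypothetical)
    gas_yield: float           # GJ of product gas per tonne of dry feed (hypothetical)
    hours_per_year: float = 7000.0

    def gas_cost(self) -> float:
        """Product gas cost per GJ under the toy model."""
        feed = self.throughput_tph * self.hours_per_year
        total = feed * self.feed_cost + self.capital_charge + self.operating_cost
        return total / (feed * self.gas_yield)

base = Plant(throughput_tph=2.5, feed_cost=10.0, capital_charge=250_000.0,
             operating_cost=150_000.0, gas_yield=12.0)

# Perturb each input by +/-20% and report the resulting change in gas cost.
for name in ("throughput_tph", "feed_cost", "capital_charge", "gas_yield"):
    for factor in (0.8, 1.2):
        p = Plant(**{**base.__dict__, name: getattr(base, name) * factor})
        delta = 100.0 * (p.gas_cost() - base.gas_cost()) / base.gas_cost()
        print(f"{name:15s} x{factor:.1f}: gas cost change {delta:+.1f}%")
```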
Abstract:
This thesis describes the development of a complete data visualisation system for large tabular databases, such as those commonly found in a business environment. A state-of-the-art 'cyberspace cell' data visualisation technique was investigated and a powerful visualisation system using it was implemented. Although it allowed databases to be explored and conclusions to be drawn, it had several drawbacks, the majority of which were due to the three-dimensional nature of the visualisation. A novel two-dimensional generic visualisation system, known as MADEN, was then developed and implemented, based upon a 2-D matrix of 'density plots'. MADEN allows an entire high-dimensional database to be visualised in one window, while permitting close analysis in 'enlargement' windows. Selections of records can be made and examined, and dependencies between fields can be investigated in detail. MADEN was used as a tool for investigating and assessing many data processing algorithms, first data-reducing (clustering) methods, then dimensionality-reducing techniques. These included a new 'directed' form of principal components analysis, several novel applications of artificial neural networks, and discriminant analysis techniques which illustrated how groups within a database can be separated. To illustrate the power of the system, MADEN was used to explore customer databases from two financial institutions, resulting in a number of discoveries which would be of interest to a marketing manager. Finally, the database of results from the 1992 UK Research Assessment Exercise was analysed. Using MADEN allowed both universities and disciplines to be graphically compared, and supplied some startling revelations, including empirical evidence of the 'Oxbridge factor'.
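A minimal sketch of the 2-D 'density plot' matrix idea follows: every pair of fields in a tabular dataset is rendered as a two-dimensional histogram, with one-dimensional histograms on the diagonal. The customer fields and synthetic data are invented for illustration; this is not the MADEN implementation.

```python
# Hedged sketch of a matrix of 2-D density plots for a hypothetical customer table.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
data = np.column_stack([
    rng.normal(45, 12, 2000),              # age (hypothetical)
    rng.lognormal(10, 0.4, 2000),          # income (hypothetical)
    rng.lognormal(8, 0.8, 2000),           # account balance (hypothetical)
    rng.poisson(20, 2000).astype(float),   # monthly transactions (hypothetical)
])
fields = ["age", "income", "balance", "transactions"]
n = data.shape[1]

fig, axes = plt.subplots(n, n, figsize=(9, 9))
for i in range(n):
    for j in range(n):
        ax = axes[i, j]
        if i == j:
            ax.hist(data[:, j], bins=30)           # diagonal: 1-D distribution
        else:
            ax.hist2d(data[:, j], data[:, i], bins=30)  # off-diagonal: density plot
        if i == n - 1:
            ax.set_xlabel(fields[j])
        if j == 0:
            ax.set_ylabel(fields[i])
        ax.set_xticks([]); ax.set_yticks([])
plt.tight_layout()
plt.savefig("density_matrix.png")
```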
Abstract:
The investigation of renal pathophysiology and toxicology has traditionally been advanced by the development of increasingly defined and refined in vitro preparations. This study has sought to develop and evaluate various methods of producing pure samples of renal proximal tubules (PTs) from the Fischer rat. The introduction summarised the most common in vitro preparations together with the parameters used to monitor viability - particularly with regard to toxic events. The most prevalent isolation methods have involved the use of collagenase to produce dissociation of the cortex. However, the present study has shown that even the mildest collagenase treatment caused significant structural damage which resulted in a longevity of only 3 hr in suspension. An alternative mechanical isolation technique has been developed in this study that consists of perfusion loading the renal glomeruli with Fe3O4 followed by disruption of the cortex by homogenisation and sequential sieving. The glomeruli are removed magnetically and the PTs are then harvested on a 64 μm sieve. PTs isolated in this way showed vastly superior structural preservation over their collagenase-isolated counterparts; oxygen consumption and enzyme leakage measurements also showed a longevity in excess of 6 hr when incubated in a very basic medium. Attempts were then made to measure the cytosolic calcium levels in both mechanically and collagenase-isolated PTs using the fluorescent calcium indicator Fura. However, results were inconclusive due to significant binding of the Fura to the external PT surfaces. In conclusion, PTs prepared by the present mechanical isolation technique exhibit superior preservation and longevity compared with even the mildest collagenase isolation technique and hence appear to offer potential advantages over collagenase isolation as an in vitro renal system.
Abstract:
This study is concerned with several proposals concerning multiprocessor systems and with the various possible methods of evaluating such proposals. After a discussion of the advantages and disadvantages of several performance evaluation tools, the author decides that simulation is the only tool powerful enough to develop a model which would be of practical use in the design, comparison and extension of systems. The main aims of the simulation package developed as part of this study are cost effectiveness, ease of use and generality. The methodology on which the simulation package is based is described in detail. The fundamental principles are that model design should reflect actual systems design, that measuring procedures should be carried out alongside design, that models should be well documented and easily adaptable, and that models should be dynamic. The simulation package itself is modular, and in this way reflects current design trends. This approach also aids documentation and ensures that the model is easily adaptable. It contains a skeleton structure and a library of segments which can be added to or directly swapped with segments of the skeleton structure, to form a model which fits a user's requirements. The study also contains the results of some experimental work carried out using the model, the first part of which tests the model's capabilities by simulating a large operating system, the ICL George 3 system; the second part deals with general questions and some of the many proposals concerning multiprocessor systems.
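The sketch below illustrates the skeleton-plus-segments idea in miniature: a fixed event-driven loop (the skeleton) dispatches jobs to processors using a swappable scheduling 'segment'. The workload, service-time distributions and policy are hypothetical and are not taken from the simulation package described.

```python
# Hedged sketch of a modular event-driven simulation: a skeleton event loop plus a
# swappable job-selection segment, illustrating the library-of-segments approach.
import heapq
import random

def fcfs(queue):
    """A swappable 'segment': first-come-first-served job selection."""
    return queue.pop(0)

def simulate(n_cpus=2, n_jobs=50, policy=fcfs, seed=1):
    random.seed(seed)
    clock, free_cpus, queue, done = 0.0, n_cpus, [], []
    # Hypothetical workload: each job arrives at an exponentially distributed time.
    events = [(random.expovariate(1.0), "arrival", i) for i in range(n_jobs)]
    heapq.heapify(events)
    arrivals = {}
    while events:
        clock, kind, job = heapq.heappop(events)
        if kind == "arrival":
            arrivals[job] = clock
            queue.append(job)
        else:                        # a departure frees a processor
            free_cpus += 1
            done.append(clock - arrivals[job])
        while free_cpus and queue:   # dispatch via the plugged-in policy segment
            free_cpus -= 1
            nxt = policy(queue)
            heapq.heappush(events, (clock + random.expovariate(0.5), "departure", nxt))
    return sum(done) / len(done)

print("mean turnaround time (FCFS segment):", round(simulate(), 2))
```

Swapping in a different scheduling segment (for example, shortest-queue-first) only requires passing a different function as `policy`, which is the sense in which the skeleton stays fixed while segments are exchanged.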
Abstract:
The research comprises a suite of studies that examines and develops the Lead Authority Partnership Scheme (LAPS) as a central intervention strategy for health and safety by local authority (LA) enforcers. Partnership working is a regulatory concept that in recent years has become more popular, but little research has been conducted to investigate, explore and evaluate its practical application. The study reviewed two contrasting approaches to partnership working between LAs and businesses, both of which were intended to secure improvements in the consistency of enforcement by the regulators and in the health and safety management systems of the participating businesses. The first was a well-established and highly prescriptive approach that required a substantial resource commitment on the part of the LA responsible for conducting a safety management review (SMR) of the business. As a result of his evaluation of the existing ‘full SMR’ scheme, the author developed a second, more flexible approach to partnership working. The research framework was based upon a primarily qualitative methodology intended to investigate and explore the impact of the new flexible arrangements for partnership working. The findings from this study of the flexible development of the scheme were compared and contrasted with those from studies of the established ‘full SMR’ scheme. A substantial degree of triangulation was applied in an attempt to strengthen validity and broaden the applicability of the research findings. Key informant interviews, participant observation, document/archive reviews, questionnaires and surveys all had their particular part to play in the overall study. The findings from this research revealed that LAPS failed to deliver consistency of LA enforcement across multiple-outlet businesses and the LA-enforced business sectors. Improvement was, however, apparent in the safety management systems of the businesses participating in LAPS. Trust between LA inspector and safety professional was key to the success of the partnerships, as was the commitment of these key individuals. Competition for precious LA resources, the priority afforded to food safety over health and safety, the perceived high resource demands of LAPS, and the structure and culture of LAs were identified as significant barriers to LA participation. Flexible approaches, whilst addressing the resource issues, introduced some fresh concerns relating to credibility and delivery. Over and above the stated aims of the scheme, LAs and businesses had their own reasons for participation, notably the personal development of individuals and kudos for the organisation. The research has explored the wider implications for partnership working, with the overall conclusion that it is most appropriately seen as a strategic-level element within a broader structured intervention strategy.
Abstract:
In 1974 Dr D M Bramwell published his research work at the University of Aston, a part of which was the establishment of an elemental work study database covering drainage construction. The Transport and Road Research Laboratory decided to extend that work as part of their continuing research programme into the design and construction of buried pipelines by placing a research contract with Bryant Construction. This research may be considered under two broad categories. In the first, site studies were undertaken to validate and extend the database. The studies showed good agreement with the existing data, with the exception of the excavation, trench shoring and pipelaying data, which were amended to incorporate new construction plant and methods. An interactive on-line computer system for drainage estimating was developed. This system stores the elemental data, synthesizes the standard time of each drainage operation and is used to determine the required resources and construction method of the total drainage activity. The remainder of the research was into the general topic of construction efficiency. An on-line command-driven computer system was produced. This system uses a stochastic simulation technique, based on distributions of site efficiency measurements, to evaluate the effects of varying performance levels. The analysis of this performance data quantifies the variability inherent in construction and demonstrates how some of this variability can be reconciled by considering the characteristics of a contract. A long-term trend of decreasing efficiency with contract duration was also identified. The results obtained from the simulation suite were compared to site records collected from current contracts. This showed that this approach will give comparable answers, but these are greatly affected by the site performance parameters.
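The stochastic simulation idea can be sketched as below: efficiency factors are resampled from a distribution of site measurements and used to scale standard operation times, giving a distribution of contract durations. The standard times and efficiency sample are hypothetical; this is an illustration of the technique, not the author's simulation suite.

```python
# Hedged sketch of a Monte Carlo simulation of contract duration driven by a
# distribution of site efficiency measurements (all figures are hypothetical).
import numpy as np

rng = np.random.default_rng(42)

# Standard times (hours) for the drainage operations on a hypothetical contract.
standard_times = np.array([120.0, 80.0, 200.0, 60.0])

# Hypothetical site efficiency measurements (1.0 = standard performance).
observed_efficiency = np.array([0.72, 0.85, 0.91, 0.78, 1.02, 0.66, 0.88, 0.95])

n_runs = 10_000
totals = np.empty(n_runs)
for k in range(n_runs):
    # Resample an efficiency factor per operation from the observed distribution.
    eff = rng.choice(observed_efficiency, size=standard_times.size, replace=True)
    totals[k] = np.sum(standard_times / eff)   # lower efficiency -> longer duration

print(f"mean contract duration: {totals.mean():.0f} h")
print(f"5th-95th percentile:    {np.percentile(totals, 5):.0f}-{np.percentile(totals, 95):.0f} h")
```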
Abstract:
We discuss aggregation of data from neuropsychological patients and the process of evaluating models using data from a series of patients. We argue that aggregation can be misleading but not aggregating can also result in information loss. The basis for combining data needs to be theoretically defined, and the particular method of aggregation depends on the theoretical question and characteristics of the data. We present examples, often drawn from our own research, to illustrate these points. We also argue that statistical models and formal methods of model selection are a useful way to test theoretical accounts using data from several patients in multiple-case studies or case series. Statistical models can often measure fit in a way that explicitly captures what a theory allows; the parameter values that result from model fitting often measure theoretically important dimensions and can lead to more constrained theories or new predictions; and model selection allows the strength of evidence for models to be quantified without forcing this into the artificial binary choice that characterizes hypothesis testing methods. Methods that aggregate and then formally model patient data, however, are not automatically preferred to other methods. Which method is preferred depends on the question to be addressed, characteristics of the data, and practical issues like availability of suitable patients, but case series, multiple-case studies, single-case studies, statistical models, and process models should be complementary methods when guided by theory development.
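As one concrete illustration of formal model selection over a case series, the sketch below fits two candidate models of accuracy as a function of word frequency to each patient and compares them with AIC and Akaike weights, which quantify strength of evidence without a binary accept/reject decision. The models, patients and data are hypothetical examples, not those used by the authors.

```python
# Hedged sketch of model selection across a patient series using AIC and Akaike weights.
import numpy as np
from scipy.optimize import curve_fit

def model_flat(freq, p):                 # Model A: accuracy independent of frequency
    return np.full_like(freq, p, dtype=float)

def model_logistic(freq, a, b):          # Model B: accuracy rises with log frequency
    return 1.0 / (1.0 + np.exp(-(a + b * np.log(freq))))

def aic(y, y_hat, k):
    rss = np.sum((y - y_hat) ** 2)
    n = len(y)
    return n * np.log(rss / n) + 2 * k

rng = np.random.default_rng(3)
freq = np.array([1, 5, 20, 100, 500, 2000], dtype=float)   # word-frequency bands

for patient in ("P1", "P2", "P3"):
    # Hypothetical per-patient accuracy on words in each frequency band.
    acc = np.clip(0.4 + 0.12 * np.log10(freq) + rng.normal(0, 0.05, freq.size), 0, 1)
    (pA,), _ = curve_fit(model_flat, freq, acc, p0=[0.6])
    (a, b), _ = curve_fit(model_logistic, freq, acc, p0=[0.0, 0.3], maxfev=5000)
    aic_a = aic(acc, model_flat(freq, pA), k=1)
    aic_b = aic(acc, model_logistic(freq, a, b), k=2)
    # Akaike weights express relative evidence for each model.
    d = np.array([aic_a, aic_b]) - min(aic_a, aic_b)
    w = np.exp(-0.5 * d) / np.sum(np.exp(-0.5 * d))
    print(f"{patient}: AIC flat={aic_a:.1f}, logistic={aic_b:.1f}, weights={w.round(2)}")
```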
Abstract:
Introduction - The design of the UK MPharm curriculum is driven by the Royal Pharmaceutical Society of Great Britain (RPSGB) accreditation process and the EU directive (85/432/EEC).[1] Although the RPSGB is informed about teaching activity in UK Schools of Pharmacy (SOPs), there is no database which aggregates information to provide the whole picture of pharmacy education within the UK. The aim of the teaching, learning and assessment study [2] was to document and map current programmes in the 16 established SOPs. Recent developments in programme delivery have resulted in a focus on deep learning (for example, through problem-based learning approaches) and on being more student-centred and less didactic through lectures. The specific objectives of this part of the study were (a) to quantify the content and modes of delivery of material as described in course documentation and (b) having categorised the range of teaching methods, to ask students to rate how important they perceived each one for their own learning (using a three-point Likert scale: very important, fairly important or not important).
Material and methods - The study design compared three datasets: (1) quantitative course document review, (2) qualitative staff interview and (3) quantitative student self-completion survey. All 16 SOPs provided a set of their undergraduate course documentation for the year 2003/4. The documentation variables were entered into Excel tables. A self-completion questionnaire was administered to all year four undergraduates (n=1847) in 15 SOPs within Great Britain, using a pragmatic mixture of methods. The survey data were analysed (n=741) using SPSS, excluding non-UK students who may have undertaken part of their studies within a non-UK university.
Results and discussion - Interviews showed that individual teachers and course module leaders determine the choice of teaching methods used. Content review of the documentary evidence showed that 51% of the taught element of the course was delivered using lectures, 31% using practicals (including computer-aided learning) and 18% using small group or interactive teaching. There was high uniformity across the schools for the first three years; variation in the final year was due to the project. The average number of hours per year across 15 schools (data for one school were not available) was: year 1: 408 hours; year 2: 401 hours; year 3: 387 hours; year 4: 401 hours. The survey showed that students perceived lectures to be the most important method of teaching after dispensing or clinical practicals. Taking the very important rating only: 94% (n=694) dispensing or clinical practicals; 75% (n=558) lectures; 52% (n=386) workshops; 50% (n=369) tutorials; 43% (n=318) directed study. Scientific laboratory practicals were rated very important by only 31% (n=227). The study shows that teaching of pharmacy to undergraduates in the UK is still essentially didactic, delivered through a high proportion of formal lectures and with high levels of staff-student contact. Schools consider lectures still to be the most cost-effective means of delivering the core syllabus to large cohorts of students. However, this limits the scope for any optionality within teaching; the scope for small group work is reduced, as is the opportunity to develop multi-professional learning or practice placements. Although novel teaching and learning techniques such as e-learning have expanded considerably over the past decade, schools of pharmacy have concentrated on lectures as the best way of coping with the huge expansion in student numbers.
References: [1] Council Directive 85/432/EEC concerning the coordination of provisions laid down by law, regulation or administrative action in respect of certain activities in the field of pharmacy. Official Journal of the European Communities, 1985. [2] Wilson K, Jesson J, Langley C, Clarke L, Hatfield K. MPharm Programmes: Where are we now? Report commissioned by the Pharmacy Practice Research Trust, 2005.
Abstract:
The acceleration of solid dosage form product development can be facilitated by the inclusion of excipients that exhibit poly-/multi-functionality, reducing the time invested in multiple excipient optimisations. Because active pharmaceutical ingredients (APIs) and tablet excipients present diverse densification behaviours upon compaction, the involvement of these different powders makes the compaction process very complicated. The aim of this study was to assess the macrometric characteristics and distribution of surface charges of two powders, indomethacin (IND) and arginine (ARG), and to evaluate their impact on the densification properties of the two powders. Response surface modelling (RSM) was employed to predict the effect of two independent variables, compression pressure (F) and ARG percentage (R) in binary mixtures, on the properties of the resultant tablets. The study looked at three responses, namely porosity (P), tensile strength (S) and disintegration time (T). Micrometric studies showed that IND had a higher charge density (net charge to mass ratio) when compared to ARG; nonetheless, ARG demonstrated good compaction properties with high plasticity (Y=28.01 MPa). Therefore, adding ARG as a filler to IND tablets was associated with better mechanical properties of the tablets (tablet tensile strength (σ) increased from 0.2±0.05 N/mm2 to 2.85±0.36 N/mm2 upon adding ARG to IND at a molar ratio of 8:1). Moreover, the tablets' disintegration time was shortened to a few seconds in some of the formulations. RSM revealed tablet porosity to be affected by both compression pressure and ARG ratio for IND/ARG physical mixtures (PMs). Conversely, the tensile strength (σ) and disintegration time (T) for the PMs were influenced by the compression pressure, ARG ratio and their interactive term (FR), and a strong correlation was observed between the experimental results and the predicted data for tablet porosity. This work provides clear evidence of the multi-functionality of ARG as filler, binder and disintegrant for directly compressed tablets.
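The response-surface analysis described above can be sketched as a least-squares fit of a second-order polynomial in compression pressure (F) and ARG percentage (R), including the interaction term FR. The design points and tensile-strength values below are hypothetical placeholders rather than the study's data.

```python
# Hedged sketch of response surface modelling: S = b0 + b1*F + b2*R + b3*F*R + b4*F^2 + b5*R^2
import numpy as np

# Hypothetical design: F in MPa, R in % ARG, S = tensile strength in N/mm^2.
F = np.array([100, 100, 100, 200, 200, 200, 300, 300, 300], dtype=float)
R = np.array([ 20,  50,  80,  20,  50,  80,  20,  50,  80], dtype=float)
S = np.array([0.4, 1.1, 1.9, 0.7, 1.6, 2.4, 0.9, 1.9, 2.9])

# Design matrix for the second-order model with interaction term FR.
X = np.column_stack([np.ones_like(F), F, R, F * R, F**2, R**2])
coeffs, *_ = np.linalg.lstsq(X, S, rcond=None)

S_hat = X @ coeffs
r2 = 1 - np.sum((S - S_hat) ** 2) / np.sum((S - S.mean()) ** 2)
print("coefficients b0..b5:", np.round(coeffs, 5))
print("R^2 of the fitted surface:", round(r2, 3))

# Predict tensile strength at an untested (hypothetical) setting, e.g. F=250 MPa, R=60%.
x_new = np.array([1.0, 250.0, 60.0, 250.0 * 60.0, 250.0**2, 60.0**2])
print("predicted S at F=250, R=60:", round(float(x_new @ coeffs), 2))
```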