30 results for integrated assessment task
in Aston University Research Archive
Abstract:
Power generation from biomass is a sustainable energy technology which can contribute to substantial reductions in greenhouse gas emissions, but with greater potential for environmental, economic and social impacts than most other renewable energy technologies. It is therefore important, when assessing bioenergy systems, to take account of not only technical but also environmental, economic and social parameters on a common basis. This work addresses the challenge of analysing, quantifying and comparing these factors for bioenergy power generation systems. A life-cycle approach is used to analyse the technical, environmental, economic and social impacts of entire bioelectricity systems, with a number of life-cycle indicators as outputs to facilitate cross-comparison. The results show that similar greenhouse gas savings are achieved with the wide variety of technologies and scales studied, but the land-use efficiency of greenhouse gas savings and the specific airborne emissions varied substantially. Also, while specific investment costs and electricity costs vary substantially from one system to another, the number of jobs created per unit of electricity delivered remains roughly constant. Recorded views of stakeholders illustrate that diverging priorities exist for different stakeholder groups, and this will influence the appropriate choice of bioenergy systems for different applications.
Abstract:
Cross-country petroleum pipelines are environmentally sensitive because they traverse varied terrain covering crop fields, forests, rivers, populated areas, deserts, hills and offshore stretches. Any malfunction of these pipelines may have a devastating effect on the environment. Hence, pipeline operators plan and design pipeline projects with sufficient consideration of environmental and social aspects alongside the technological alternatives. Traditionally, in project appraisal, the optimum technical alternative is selected using financial analysis. Impact assessments (IA) are then carried out to justify the selection and obtain subsequent statutory approval. However, the IAs often suggest alternative sites and/or alternative technology and implementation methodology, resulting in revision of the entire technical and financial analysis. This study addresses the above issues by developing an integrated framework for project feasibility analysis with the application of the analytic hierarchy process (AHP), a multiple-attribute decision-making technique. The model considers technical analysis (TA), socio-economic IA (SEIA) and environmental IA (EIA) in an integrated framework to select the best project from a few alternative feasible projects. A subsequent financial analysis then justifies the selection. The entire methodology is explained here through a case application to a cross-country petroleum pipeline project in India.
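The AHP ranking step at the heart of this framework can be sketched as follows. This is a minimal illustration of standard AHP weight derivation (geometric-mean approximation plus a consistency check), not the authors' implementation; the criteria and the pairwise judgement values are hypothetical.

```python
import math

def ahp_weights(matrix):
    """Approximate AHP priority weights via the geometric-mean method,
    with Saaty's consistency ratio as a sanity check on the judgements."""
    n = len(matrix)
    gm = [math.prod(row) ** (1.0 / n) for row in matrix]
    total = sum(gm)
    weights = [g / total for g in gm]
    # Consistency: estimate lambda_max from A*w, then CI and CR.
    aw = [sum(matrix[i][j] * weights[j] for j in range(n)) for i in range(n)]
    lam = sum(aw[i] / weights[i] for i in range(n)) / n
    ci = (lam - n) / (n - 1)
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]  # Saaty's random index
    return weights, ci / ri

# Hypothetical pairwise judgements over three criteria:
# technical analysis (TA), socio-economic IA (SEIA), environmental IA (EIA).
# A[i][j] states how much more important criterion i is than j (1-9 scale).
A = [
    [1,     3,   2],
    [1 / 3, 1,   1 / 2],
    [1 / 2, 2,   1],
]
w, cr = ahp_weights(A)
print([round(x, 3) for x in w])  # priority weights, summing to 1
print(cr < 0.1)                  # judgements acceptably consistent
```

Each candidate project would then be scored against these criterion weights, with the highest composite priority selected before financial analysis confirms the choice.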
Abstract:
This paper seeks to advance research and practice related to the role of employers in all stages of the assessment process of work-based learning (WBL) within a tripartite relationship of higher education institution (HEI), student and employer. It proposes a research-informed quality enhancement framework to develop good practice in engaging employers as partners in assessment. The Enhancement Framework comprises three dimensions, each of which includes elements and questions generated by the experiences of WBL students, HEI staff and employers. The three dimensions of the Enhancement Framework are: 1. ‘premises of assessment’ encompassing issues of learning, inclusion, standards and value; 2. ‘practice’, encompassing stages of assessment made up of course design, assessment task, responsibilities, support, grading and feedback; 3. ‘communication of assessment’ with the emphasis on role clarity, language and pathways. With its prompt questions, the Enhancement Framework may be used as a capacity-building tool for promoting, sustaining, benchmarking and evaluating productive dialogue and critical reflection about assessment between WBL partners. The paper concludes by emphasising the need for professional development as well as policy and research development, so that assessment in WBL can more closely correspond to the potentially transformative nature of the learning experience.
Abstract:
This action research (AR) study explores an alternative approach to vocabulary instruction for low-proficiency university students: a change from targeting individual words from the general service list (West, 1953) to targeting frequent verb + noun collocations. A review of the literature indicated that a focus on collocations instead of individual words could potentially address the students' productive challenges with targeted vocabulary. Over the course of four reflective cycles, this thesis addresses three main aspects of collocation instruction. First, it examines whether the students believe studying collocations is more useful than studying individual lexical items. Second, the thesis investigates whether a focus on collocations leads to improvements in spoken fluency. This is tested through a comparison of a pre-intervention spoken assessment task with the findings from the same task completed 15 weeks later, after the intervention. Third, the thesis explores different procedures for the instruction of collocations under the classroom constraints of a university teaching context. In the first of the four reflective cycles, data is collected which indicates that the students believe a focus on collocations is superior to only teaching individual lexical items, that in the students' opinion their productive abilities with the targeted structures have improved, and that delexicalized verb collocations are problematic for low-proficiency students. Reflective cycle two produces evidence indicating that productive tasks are superior to receptive tasks for fluency development. In reflective cycle three, productively challenging classroom tasks are investigated further, and the findings indicate that tasks with higher productive demands result in greater improvements in spoken fluency. The fourth reflective cycle uses a different type of collocation list: frequent adjective + noun collocations.
Despite this change, the findings remain consistent in that certain types of collocations are problematic for low-proficiency language learners and that the evidence shows productive tasks are necessary to improve the students’ spoken ability.
Abstract:
Proteomics, the analysis of expressed proteins, has been an important developing area of research for the past two decades [Anderson, NG, Anderson, NL. Twenty years of two-dimensional electrophoresis: past, present and future. Electrophoresis 1996;17:443-53]. Advances in technology have led to a rapid increase in applications to a wide range of samples; from initial experiments using cell lines, more complex tissues and biological fluids are now being assessed to establish changes in protein expression. A primary aim of clinical proteomics is the identification of biomarkers for diagnosis and therapeutic intervention of disease, by comparing the proteomic profiles of control and disease states, and of differing physiological states. This expansion into clinical samples has not been without difficulties, owing to the complexity and dynamic range of plasma and human tissues, including tissue biopsies. The most widely used techniques for the analysis of clinical samples are surface-enhanced laser desorption/ionisation mass spectrometry (SELDI-MS) and 2-dimensional gel electrophoresis (2-DE) coupled to matrix-assisted laser desorption/ionisation mass spectrometry (MALDI-MS) [Person, MD, Monks, TJ, Lau, SS. An integrated approach to identifying chemically induced posttranslational modifications using comparative MALDI-MS and targeted HPLC-ESI-MS/MS. Chem. Res. Toxicol. 2003;16:598-608]. This review aims to summarise the findings of studies that have used proteomic research methods to analyse samples from clinical studies, and to assess the impact that proteomic techniques have had in assessing clinical samples. © 2004 The Canadian Society of Clinical Chemists. All rights reserved.
Abstract:
The thesis presents a two-dimensional Risk Assessment Method (RAM) in which the assessment of risk to groundwater resources incorporates both the quantification of the probability of occurrence of contaminant source terms and the assessment of the resultant impacts. The approach emphasizes the need for a greater dependency on the potential pollution sources, rather than the traditional approach where assessment is based mainly on the intrinsic geo-hydrologic parameters. The risk is calculated using Monte Carlo simulation, whereby random pollution events are generated according to the same distribution as historically occurring events or an a priori probability distribution. Integrated mathematical models then simulate contaminant concentrations at predefined monitoring points within the aquifer. The spatial and temporal distributions of the concentrations are calculated from repeated realisations, and the number of times a user-defined concentration magnitude is exceeded is quantified as the risk. The method was set up by integrating MODFLOW-2000, MT3DMS and a FORTRAN-coded risk model, and automated using a DOS batch-processing file. GIS software was employed in producing the input files and for the presentation of the results. The functionality of the method, as well as its sensitivity to model grid sizes, contaminant loading rates, length of stress periods, and the historical frequencies of occurrence of pollution events, was evaluated using hypothetical scenarios and a case study. Chloride-related pollution sources were compiled and used as indicative potential contaminant sources for the case study. At any active model cell, if a randomly generated number is less than the probability of pollution occurrence, the risk model generates a synthetic contaminant source term as an input into the transport model. The results of the applications of the method are presented in the form of tables, graphs and spatial maps.
Varying the model grid size indicates no significant effect on the simulated groundwater head. The simulated frequency of daily occurrence of pollution incidents is also independent of the model dimensions. However, the simulated total contaminant mass generated within the aquifer, and the associated volumetric numerical error, appear to increase with increasing grid size. Also, the migration of the contaminant plume advances faster with coarse grid sizes than with finer grid sizes. The number of daily contaminant source terms generated, and consequently the total mass of contaminant within the aquifer, increases in non-linear proportion to the increasing frequency of occurrence of pollution events. The risk of pollution from a number of sources all occurring by chance together was evaluated and presented quantitatively as risk maps. This capability to combine the risk to a groundwater feature from numerous potential sources of pollution proved to be a great asset of the method, and a significant advantage over contemporary risk and vulnerability methods.
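The core Monte Carlo loop described above — draw a random number per stress period, fire a synthetic source term when the draw falls below the occurrence probability, and count realisations in which a user-defined concentration threshold is exceeded — can be sketched as a toy model. This Python sketch is an illustration only, not the MODFLOW/MT3DMS/FORTRAN implementation; the transport step is replaced by a crude decay surrogate, and all numerical values are hypothetical.

```python
import random

def simulate_risk(p_event, threshold, realisations=2000, periods=365, seed=42):
    """Fraction of realisations in which the monitored concentration
    exceeds `threshold` at least once (toy stand-in for the transport model)."""
    rng = random.Random(seed)
    exceedances = 0
    for _ in range(realisations):
        conc = 0.0
        hit = False
        for _ in range(periods):
            # A source term fires when the random draw falls below the
            # historical probability of a pollution event at this cell.
            if rng.random() < p_event:
                conc += rng.uniform(5.0, 20.0)  # synthetic chloride load, mg/L
            conc *= 0.95                        # crude decay/dilution surrogate
            if conc > threshold:
                hit = True
        exceedances += hit
    return exceedances / realisations

risk = simulate_risk(p_event=0.01, threshold=50.0)
print(0.0 <= risk <= 1.0)  # risk reported as an exceedance probability
```

In the thesis itself this per-cell draw drives MT3DMS source terms, and the exceedance counts across realisations are mapped spatially as the risk maps described above.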
Abstract:
The evaluation and selection of industrial projects before an investment decision is customarily done using marketing, technical and financial information. Subsequently, environmental impact assessment and social impact assessment are carried out, mainly to satisfy the statutory agencies. Because of stricter environmental regulations in developed and developing countries, impact assessment quite often suggests alternative sites, technologies, designs, and implementation methods as mitigating measures. This causes considerable delay in completing project feasibility analysis and selection, as the complete analysis must be repeated until the statutory regulatory authority approves the project. Moreover, project analysis through the above process often results in a sub-optimal project, as financial analysis may eliminate better options, since a more environmentally friendly alternative will usually be more cost-intensive. In this circumstance, this study proposes a decision support system which analyses projects with respect to market, technical, social and environmental criteria in an integrated framework using the analytic hierarchy process, a multiple-attribute decision-making technique. This not only reduces the duration of project evaluation and selection, but also helps select the optimal project for the organization for sustainable development. The entire methodology has been applied to a cross-country oil pipeline project in India and its effectiveness has been demonstrated. © 2005 Elsevier B.V. All rights reserved.
Abstract:
Feasibility studies of industrial projects consist of multiple analyses carried out sequentially. This is time-consuming, and each analysis screens out alternatives based solely on the merits of that analysis. In cross-country petroleum pipeline project selection, market analysis determines the throughput requirement and the supply and demand points. Technical analysis identifies technological options and alternatives for pipeline routes. Economic and financial analyses derive the least-cost option. The impact assessment addresses environmental issues. The impact assessment often suggests alternative sites, routes, technologies, and/or implementation methodology, necessitating revision of the technical and financial analysis. This report suggests an integrated approach to feasibility analysis, presented as a case application to a cross-country petroleum pipeline project in India.
Abstract:
Purpose - The purpose of the paper is to develop an integrated quality management model which identifies problems, suggests solutions, develops a framework for implementation and helps evaluate the performance of healthcare services dynamically. Design/methodology/approach - This paper uses logical framework analysis (LFA), a matrix approach to project planning for managing quality. This has been applied to three acute healthcare services (operating room utilization, accident and emergency, and intensive care) in order to demonstrate its effectiveness. Findings - The paper finds that LFA is an effective method of quality management for hospital-based healthcare services. Research limitations/implications - This paper shows LFA applied to three service processes in one hospital; ideally it should also be tested in several hospitals and other services. Practical implications - The proposed model can be applied in hospital-based healthcare services to improve performance. Originality/value - The paper shows that quality improvement in healthcare services is a complex and multi-dimensional task. Although various quality management tools are routinely deployed for identifying quality issues in healthcare delivery, and corrective measures are taken for superior performance, there has been no integrated approach that can identify and analyse issues, provide solutions to resolve those issues, and develop a project management framework (planning, monitoring, and evaluating) to implement those solutions in order to improve process performance. This study introduces an integrated and uniform quality management tool that links operations with organizational strategies. © Emerald Group Publishing Limited.
Abstract:
Aim: To undertake a national study of teaching, learning and assessment in UK schools of pharmacy. Design: Triangulation of course documentation, 24 semi-structured interviews undertaken with 29 representatives from the schools, and a survey of all final year students (n=1,847) in the 15 schools within the UK during 2003–04. Subjects and setting: All established UK pharmacy schools and final year MPharm students. Outcome measures: Data were combined and analysed under the topics of curriculum, teaching and learning, assessment, multi-professional teaching and learning, placement education and research projects. Results: Professional accreditation was the main driver for curriculum design but links to preregistration training were poor. Curricula were consistent but offered little student choice. On average half the curriculum was science-based. Staff supported the science content but students less so. Courses were didactic but schools were experimenting with new methods of learning. Examinations were the principal form of assessment but the contribution of practice to the final degree ranged considerably (21–63%). Most students considered the assessment load to be about right but with too much emphasis upon knowledge. Assessment of professional competence was focused upon dispensing and pharmacy law. All schools undertook placement teaching in hospitals but there was little in community/primary care. There was little inter-professional education. Resources and logistics were the major limiters. Conclusions: There is a need for an integrated review of the accreditation process for the MPharm and preregistration training, and a redefinition of professional competence at undergraduate level.
Abstract:
Recent functional magnetic resonance imaging (fMRI) investigations of the interaction between cognition and reward processing have found that the lateral prefrontal cortex (PFC) areas are preferentially activated to both increasing cognitive demand and reward level. Conversely, ventromedial PFC (VMPFC) areas show decreased activation to the same conditions, indicating a possible reciprocal relationship between cognitive and emotional processing regions. We report an fMRI study of a rewarded working memory task, in which we further explore how the relationship between reward and cognitive processing is mediated. We not only assess the integrity of reciprocal neural connections between the lateral PFC and VMPFC brain regions in different experimental contexts but also test whether additional cortical and subcortical regions influence this relationship. Psychophysiological interaction analyses were used as a measure of functional connectivity in order to characterize the influence of both cognitive and motivational variables on connectivity between the lateral PFC and the VMPFC. Psychophysiological interactions revealed negative functional connectivity between the lateral PFC and the VMPFC in the context of high memory load, and high memory load in tandem with a highly motivating context, but not in the context of reward alone. Physiophysiological interactions further indicated that the dorsal anterior cingulate and the caudate nucleus modulate this pathway. These findings provide evidence for a dynamic interplay between lateral PFC and VMPFC regions and are consistent with an emotional gating role for the VMPFC during cognitively demanding tasks. Our findings also support neuropsychological theories of mood disorders, which have long emphasized a dysfunctional relationship between emotion/motivational and cognitive processes in depression.
Abstract:
Microbial transglutaminase (mTGase) is an enzyme that introduces a covalent bond between peptide-bound glutamine and lysine residues. Proteins cross-linked in this manner are often more resistant to proteolytic degradation and show increased tensile strength. This study evaluates the effects of mTGase-mediated cross-linking of collagen on the cellular morphology, behaviour and viability of murine 3T3 fibroblasts following their seeding into collagen scaffolds. Additionally, cell-mediated scaffold contraction, porosity and the level of cross-linking of the scaffold were analysed using image analysis software, scanning electron microscopy (SEM), colorimetric assays, and Fourier transform infrared spectroscopy (FTIR). We demonstrate that biocompatibility and cellular morphology remained unaffected when comparing cultures of fibroblasts integrated in mTGase cross-linked collagen scaffolds with their native collagen counterparts. It was also found that the structural characteristics of collagen were preserved while introducing enzymatically resistant covalent bonds.
Abstract:
Health and safety policies may be regarded as the cornerstone for the positive prevention of occupational accidents and diseases. The Health and Safety at Work, etc. Act 1974 makes it a legal duty for employers to prepare and revise a written statement of a general policy with respect to the health and safety at work of employees, as well as the organisation and arrangements for carrying out that policy. Despite their importance and the legal requirement to prepare them, health and safety policies have been found, in a large number of plastics processing companies (particularly small companies), to be poorly prepared, and inadequately implemented and monitored. An important cause of these inadequacies is the lack of the health and safety knowledge and expertise necessary to prepare, implement and monitor policies. One possible way of remedying this problem is to investigate the feasibility of using computers to develop expert system programs that simulate the health and safety (HS) experts' task of preparing the policies, and assist companies in implementing and monitoring them. Such programs use artificial intelligence (AI) techniques to solve this sort of problem, which is heuristic in nature and requires symbolic reasoning. Expert systems have been used successfully in a variety of fields such as medicine and engineering. An important phase in the feasibility of developing such systems is knowledge engineering, which consists of identifying the knowledge required, then eliciting, structuring and representing it in an appropriate computer programming language.
Abstract:
A re-examination of fundamental concepts and a formal structuring of the waveform analysis problem is presented in Part I: e.g. the nature of frequency is examined, and a novel alternative to the classical methods of detection is proposed and implemented which has the advantages of speed and independence from amplitude. Waveform analysis provides the link between Parts I and II. Part II is devoted to Human Factors and the Adaptive Task Technique. The historical, technical and intellectual development of the technique is traced in a review which examines the evidence of its advantages relative to non-adaptive fixed-task methods of training, skill assessment and man-machine optimisation. A second review examines research evidence on the effect of vibration on manual control ability. Findings are presented in terms of percentage increment or decrement in performance relative to performance without vibration in the range 0–0.6 RMS g. Primary task performance was found to vary by as much as 90% between tasks at the same RMS g. Differences in task difficulty accounted for this difference. Within tasks, vibration-added difficulty accounted for the effects of vibration intensity. Secondary tasks were found to be largely insensitive to vibration, except secondaries which involved fine manual adjustment of minor controls. Three experiments are reported next in which an adaptive technique was used to measure the percentage task difficulty added by vertical random and sinusoidal vibration to a critical compensatory tracking task. At vibration intensities between 0 and 0.09 RMS g it was found that random vibration added (24.5 × RMS g / 7.4) × 100% to the difficulty of the control task. An equivalence relationship between random and sinusoidal vibration effects was established based upon added task difficulty.
Waveform analyses applied to the experimental data served to validate phase-plane analysis and uncovered the development of a control strategy and possibly a vibration-isolation strategy. The submission ends with an appraisal of the subjects mentioned in the thesis title.
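The added-difficulty relation quoted in the abstract amounts to simple arithmetic and can be checked directly; this sketch takes the formula exactly as stated, (24.5 × RMS g / 7.4) × 100%, and evaluates it at the top of the tested intensity range.

```python
def added_difficulty_pct(rms_g):
    """Percentage of difficulty added by vertical random vibration,
    per the relation quoted in the abstract: (24.5 * RMS g / 7.4) * 100%."""
    return 24.5 * rms_g / 7.4 * 100.0

# At the upper end of the tested range, 0.09 RMS g:
print(round(added_difficulty_pct(0.09), 1))  # 29.8 (% added difficulty)
```

So at the maximum intensity studied, random vibration adds roughly 30% to the difficulty of the tracking task.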
Abstract:
Task classification is introduced as a method for the evaluation of monitoring behaviour in different task situations. On the basis of an analysis of different monitoring tasks, a task classification system comprising four task 'dimensions' is proposed. The perceptual speed and flexibility of closure categories, which are identified with signal discrimination type, comprise the principal dimension in this taxonomy, the others being sense modality, the time course of events, and source complexity. It is also proposed that decision theory provides the most complete method for the analysis of performance in monitoring tasks. Several different aspects of decision theory in relation to monitoring behaviour are described. A method is also outlined whereby both accuracy and latency measures of performance may be analysed within the same decision theory framework. Eight experiments and an organizational study are reported. The results show that a distinction can be made between the perceptual efficiency (sensitivity) of a monitor and his criterial level of response, and that in most monitoring situations there is no decrement in efficiency over the work period, but an increase in the strictness of the response criterion. The range of tasks exhibiting either or both of these performance trends can be specified within the task classification system. In particular, it is shown that a sensitivity decrement is only obtained for 'speed' tasks with a high stimulation rate. A distinctive feature of 'speed' tasks is that target detection requires the discrimination of a change in a stimulus relative to preceding stimuli, whereas in 'closure' tasks, the information required for the discrimination of targets is presented at the same point in time. In the final study, the specification of tasks yielding sensitivity decrements is shown to be consistent with a task classification analysis of the monitoring literature.
It is also demonstrated that the signal type dimension has a major influence on the consistency of individual differences in performance in different tasks. The results provide an empirical validation for the 'speed' and 'closure' categories, and suggest that individual differences are not completely task specific but are dependent on the demands common to different tasks. Task classification is therefore shown to enable improved generalizations to be made of the factors affecting 1) performance trends over time, and 2) the consistency of performance in different tasks. A decision theory analysis of response latencies is shown to support the view that criterion shifts are obtained in some tasks, while sensitivity shifts are obtained in others. The results of a psychophysiological study also suggest that evoked potential latency measures may provide temporal correlates of criterion shifts in monitoring tasks. Among other results, the finding that the latencies of negative responses do not increase over time is taken to invalidate arousal-based theories of performance trends over a work period. An interpretation in terms of expectancy, however, provides a more reliable explanation of criterion shifts. Although the mechanisms underlying the sensitivity decrement are not completely clear, the results rule out 'unitary' theories such as observing response and coupling theory. It is suggested that an interpretation in terms of memory and data limitations on information processing provides the most parsimonious explanation of all the results in the literature relating to the sensitivity decrement. Task classification therefore enables the refinement and selection of theories of monitoring behaviour in terms of their reliability in generalizing predictions to a wide range of tasks. It is thus concluded that task classification and decision theory provide a reliable basis for the assessment and analysis of monitoring behaviour in different task situations.
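The separation of perceptual sensitivity from response criterion drawn above is the standard signal detection theory decomposition; a minimal sketch of how the two indices are derived from hit and false-alarm rates follows. The specific rates here are hypothetical illustrations, not data from the reported experiments.

```python
import statistics

norm = statistics.NormalDist()

def sdt_indices(hit_rate, fa_rate):
    """Separate sensitivity (d') from response criterion (c)
    using z-transforms of the hit and false-alarm rates."""
    zh, zf = norm.inv_cdf(hit_rate), norm.inv_cdf(fa_rate)
    d_prime = zh - zf
    criterion = -(zh + zf) / 2.0
    return d_prime, criterion

# Early vs late in the watch: the hit rate falls, but so does the
# false-alarm rate -- a stricter criterion, not lower sensitivity.
early = sdt_indices(hit_rate=0.80, fa_rate=0.20)
late = sdt_indices(hit_rate=0.60, fa_rate=0.09)
print(round(early[0], 2), round(late[0], 2))  # d' roughly unchanged
print(round(early[1], 2), round(late[1], 2))  # criterion rises over the watch
```

This is exactly the pattern the thesis reports for most monitoring situations: stable efficiency over the work period, with the apparent decline in detections explained by a stricter response criterion.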