944 results for assessment, sociocultural theory, summative assessment, formative assessment


Relevance: 70.00%

Abstract:

The design of nuclear power plants has to follow a number of regulations aimed at limiting the risks inherent in this type of installation. The goal is to prevent and to limit the consequences of any possible incident that might threaten the public or the environment. To verify that the safety requirements are met, a safety assessment process is followed. Safety analysis is a key component of a safety assessment, and it incorporates both probabilistic and deterministic approaches. The deterministic approach attempts to ensure that the various situations, and in particular the accidents, considered to be plausible have been taken into account, and that the monitoring systems and the engineered safety and safeguard systems will be capable of meeting the safety goals. Probabilistic safety analysis, on the other hand, tries to demonstrate that the safety requirements are met for potential accidents both within and beyond the design basis, thus identifying vulnerabilities not necessarily accessible through deterministic safety analysis alone. Probabilistic safety assessment (PSA) methodology is widely used in the nuclear industry and is especially effective in the comprehensive assessment of the measures needed to prevent accidents of small probability but severe consequences. Still, the trend towards risk-informed regulation (RIR) has demanded a broader use of risk assessment techniques and a significant need to further extend the scope and quality of PSA.

This is where the Theory of Stimulated Dynamics (TSD) comes in: it is the mathematical foundation of the Integrated Safety Assessment (ISA) methodology developed by the Modelling and Simulation (MOSI) branch of the CSN (Consejo de Seguridad Nuclear). This methodology extends classical PSA with accident dynamics analysis, an assessment of the damage associated with the transients, and a computation of the damage frequency. The application of the ISA methodology requires a computational framework called SCAIS (Simulation Code System for Integrated Safety Assessment). SCAIS supports accident dynamics analysis through the simulation of nuclear accident sequences and operating procedures; it also includes probabilistic quantification of fault trees and sequences, and integration and statistical treatment of risk metrics. SCAIS relies on intensive use of code coupling techniques to join typical thermal-hydraulic, severe accident and probability calculation codes. The integration of accident simulation into the risk assessment process, which requires the use of complex nuclear plant models, is what makes the approach so powerful, yet it comes at the cost of an enormous increase in complexity. As the computational burden is concentrated in the accident simulation codes, the question arises of whether the number of required simulations can be reduced; this is the focus of the present work.

This document presents the work done on investigating more efficient techniques for the risk assessment process within the ISA methodology. The primary goal of these techniques is to decrease the number of simulations needed for an adequate estimation of the damage probability. As the methodology and tools are relatively recent, little work has been done along this line of investigation, making it a difficult but necessary task, and because of time limitations the scope of the work had to be reduced. Some assumptions were therefore made in order to work with simplified scenarios best suited to an initial approximation of the problem. The following section explains in detail the process followed to design and test the developed techniques. The next section then introduces the general concepts and formulae of TSD, which are at the core of the risk assessment process. Afterwards, a description of the simulation framework requirements and design is given, followed by an introduction to the developed techniques with full detail of their mathematical background and procedures. Later, the test case is described and the results of applying the techniques are shown. Finally, the conclusions are presented and future lines of work are outlined.
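As a rough illustration of the kind of estimator this reduction targets, the sketch below estimates a damage-exceedance probability over one uncertain sequence parameter, first by crude Monte Carlo over many simulator calls and then by importance sampling over far fewer. This is a minimal, hypothetical example, not the technique developed in the thesis: the surrogate transient model, the damage limit and the parameter distributions are all invented, and in the real ISA/SCAIS setting each call would be a full coupled plant simulation.

```python
"""Illustrative sketch only: estimating a damage probability with fewer
simulator calls, in the spirit of the ISA setting described above. The
`transient_peak_temperature` surrogate stands in for an expensive
thermal-hydraulic code; all names and numbers are hypothetical."""
import numpy as np

rng = np.random.default_rng(0)
DAMAGE_LIMIT = 1200.0  # hypothetical peak-temperature damage threshold [K]

def transient_peak_temperature(delay_s: float) -> float:
    # Placeholder for an expensive accident-sequence simulation whose
    # outcome depends on an uncertain operator-action delay.
    return 900.0 + 25.0 * delay_s + rng.normal(0.0, 20.0)

def crude_monte_carlo(n: int) -> float:
    # Sample delays from their (assumed) exponential distribution and
    # count how often the damage limit is exceeded.
    delays = rng.exponential(scale=5.0, size=n)
    hits = sum(transient_peak_temperature(d) > DAMAGE_LIMIT for d in delays)
    return hits / n

def importance_sampling(n: int) -> float:
    # Bias sampling toward long delays (where damage is likely) and
    # reweight, so far fewer simulations give a comparable estimate.
    proposal_scale = 15.0
    delays = rng.exponential(scale=proposal_scale, size=n)
    weights = (np.exp(-delays / 5.0) / 5.0) / (np.exp(-delays / proposal_scale) / proposal_scale)
    indicator = np.array([transient_peak_temperature(d) > DAMAGE_LIMIT for d in delays])
    return float(np.mean(indicator * weights))

print(f"crude MC (10000 runs):          {crude_monte_carlo(10_000):.4e}")
print(f"importance sampling (500 runs): {importance_sampling(500):.4e}")
```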

Relevance: 70.00%

Abstract:

Planning sustainable urban mobility is a complex task involving a high degree of uncertainty due to the long-term planning horizon, the wide spectrum of potential policy packages, the need for effective and efficient implementation, the large geographical scale, the need to consider economic, social and environmental goals, and travellers' responses to the various courses of action and their political acceptability (Shiftan et al., 2003). Moreover, with the inevitable trends in motorisation and urbanisation, the demand for land and mobility in cities is growing dramatically. Consequently, the problems of traffic congestion, environmental deterioration, air pollution, energy consumption, community inequity, etc., are becoming more and more critical for society (EU, 2011). Certainly, this course is not sustainable in the long term. To address this challenge and achieve sustainable development, a long-term strategic urban plan, with its potentially important implications, should be established.

This thesis contributes to the long-term assessment of urban mobility by establishing an innovative methodology for optimising and evaluating two types of transport demand management (TDM) measures. The new methodology relaxes the utility-based decision-making assumption by embedding anticipated-regret and combined utility-regret decision mechanisms in an integrated transport planning framework. The proposed methodology includes two major aspects: 1) Construction of policy scenarios consisting of a single measure or combined TDM policy packages, using a survey method that incorporates regret theory. The purpose of building the TDM scenarios is to address the specific implementation of each TDM measure in terms of time frame and geographic scale. In total, 13 TDM scenarios are built in terms of the most desirable, the most expected and the least-regret choice by means of a two-round Delphi-based survey. 2) Development of a combined utility-regret assessment framework based on multicriteria decision analysis (MCDA). This framework is used to compare the contribution of each TDM scenario towards sustainable mobility and to determine the best scenario, considering not only the objective utility value obtained from the utility-based MCDA but also a regret value calculated via a regret-based MCDA. The objective function of the utility-based MCDA is integrated into a land use and transport interaction model and is used to optimise and assess the long-term impacts of the constructed TDM scenarios. A regret-based model, the reference-dependent regret model (RDRM), is adapted to analyse the contribution of each TDM scenario from a subjective point of view. The suggested methodology is implemented and validated in a case study of the province of Madrid. It defines a comprehensive technical procedure for assessing the strategic effects of transport demand management measures and constitutes a useful, transparent and flexible planning tool for both planners and decision-makers.
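For readers unfamiliar with the combined utility-regret idea, the following toy sketch shows how a weighted-sum utility can be paired with a regret term measured against the best-performing alternative on each criterion. The criteria, weights, scenario scores, regret function and trade-off rule are all invented for illustration; the actual RDRM formulation used in the thesis differs.

```python
"""Hypothetical sketch of a combined utility-regret MCDA ranking of TDM
scenarios; criteria, weights and scores are invented, and the thesis's
reference-dependent regret model (RDRM) is not reproduced here."""
import numpy as np

criteria = ["CO2 reduction", "travel-time savings", "equity"]
weights = np.array([0.4, 0.4, 0.2])          # assumed criterion weights
scenarios = {                                # assumed normalised scores
    "parking pricing": np.array([0.6, 0.5, 0.7]),
    "cordon toll":     np.array([0.8, 0.7, 0.4]),
    "transit subsidy": np.array([0.5, 0.6, 0.9]),
}

scores = np.vstack(list(scenarios.values()))
best_per_criterion = scores.max(axis=0)      # reference point for regret

for name, s in scenarios.items():
    utility = float(weights @ s)
    # Regret grows with the shortfall against the best alternative on
    # each criterion (a simple concave regret function is assumed).
    regret = float(weights @ np.log1p(best_per_criterion - s))
    combined = utility - regret              # assumed trade-off rule
    print(f"{name:16s} utility={utility:.3f} regret={regret:.3f} combined={combined:.3f}")
```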

Relevance: 70.00%

Abstract:

The impedance-based stability-assessment method has turned out to be a very effective tool, and its usage is rapidly growing in applications ranging from conventional interconnected dc/dc systems to grid-connected renewable energy systems. The results are sometimes given as a certain forbidden region in the complex plane, outside of which the impedance ratio, known as the minor-loop gain, must stay in order to ensure robust stability. This letter discusses the circle-like forbidden region occupying the minimum area in the complex plane, defined by applying the maximum peak criterion, a well-known concept in control engineering. The investigation shows that the circle-like forbidden region will ensure robust stability only if the impedance-based minor-loop gain is determined at the very input or output of each subsystem within the interconnected system. Experimental evidence is provided based on a small-scale dc/dc distributed system.
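As a numerical illustration of the geometric test implied by the maximum peak criterion: bounding the sensitivity peak by Ms is equivalent to keeping the Nyquist locus of the minor-loop gain at a distance of at least 1/Ms from the critical point -1. The sketch below checks this for an invented source/load impedance pair; only the distance test itself reflects the letter's argument, and the impedances, Ms value and frequency range are assumptions.

```python
"""Minimal numerical check of the circle-like forbidden region discussed
above: the minor-loop gain must keep a distance of at least 1/Ms from the
critical point -1 (maximum peak criterion). The example impedances are
invented for illustration."""
import numpy as np

Ms = 2.0                      # assumed allowed sensitivity peak
radius = 1.0 / Ms             # radius of the forbidden circle centred at -1

w = np.logspace(1, 5, 2000)   # rad/s
s = 1j * w

# Hypothetical minor-loop gain: source output impedance over load input
# impedance of an interconnected dc/dc system.
Z_out = 0.5 + s * 2e-5                 # ohms (resistive-inductive source)
Z_in = 10.0 / (1.0 + s * 1e-4)         # ohms (load input impedance)
minor_loop_gain = Z_out / Z_in

distance_to_critical = np.abs(minor_loop_gain + 1.0)
if distance_to_critical.min() >= radius:
    print(f"Locus stays outside the forbidden circle "
          f"(min distance {distance_to_critical.min():.3f} >= {radius:.3f}); "
          f"sensitivity peak bounded by Ms = {Ms}")
else:
    print("Locus enters the forbidden circle: peak criterion violated")
```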

Relevance: 70.00%

Abstract:

This paper presents an analysis of the damage domains of the 30 MWth prototype High-Temperature Engineering Test Reactor (HTTR) operated by the Japan Atomic Energy Agency (JAEA). For this purpose, an in-house deterministic risk assessment computational tool was developed based on the Theory of Stimulated Dynamics (TSD). To illustrate the methodology and the applicability of the developed modelling approach, assessment results for a control rod (CR) withdrawal accident under subcritical conditions are presented and compared with those obtained by the JAEA.
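To make the notion of a damage domain concrete, the toy sketch below scans a plane of two uncertain sequence parameters and flags the region where a safety limit is exceeded. All parameters, ranges, the limit and the response function are hypothetical stand-ins for the coupled plant simulations used in the actual study.

```python
"""Toy damage-domain map in the spirit of the assessment described above:
the region of an uncertain-parameter plane where a safety limit is
exceeded. The response function and numbers are invented."""
import numpy as np

FUEL_TEMP_LIMIT = 1600.0  # hypothetical fuel-temperature limit [degC]

def peak_fuel_temperature(withdrawal_speed, scram_delay):
    # Placeholder response surface for a CR-withdrawal transient.
    return 900.0 + 8.0 * withdrawal_speed * scram_delay

speeds = np.linspace(1.0, 10.0, 50)   # cm/s, assumed range
delays = np.linspace(0.0, 30.0, 50)   # s, assumed range
S, D = np.meshgrid(speeds, delays)

damage_domain = peak_fuel_temperature(S, D) > FUEL_TEMP_LIMIT
print(f"fraction of the explored plane inside the damage domain: "
      f"{damage_domain.mean():.2%}")
```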

Relevance: 70.00%

Abstract:

When used appropriately, self- and peer-assessment are very effective learning tools. In the present work, instructor formative assessment and feedback, self-assessment (SA), and peer-assessment (PA) have been compared. During the first part of a semester, the students followed a continuous formative assessment. Subsequently, they were divided into two subgroups with similar performance. One subgroup performed SA, and the other followed PA during the last part of the course. The performances of the two groups in solving problems were compared. Results suggest that PA is a more effective learning tool than SA, and both are more effective than instructor formative assessment. However, a survey conducted at the end of the experiment showed higher student confidence in instructor assessment than in PA. The students recognized the usefulness of acting as peer assessors, but believed that SA helped them more than PA.

Relevance: 70.00%

Abstract:

The integration of geo-information from multiple sources and of diverse nature in developing mineral favourability indexes (MFIs) is a well-known problem in mineral exploration and mineral resource assessment. Fuzzy set theory provides a convenient framework to combine and analyse qualitative and quantitative data independently of their source or characteristics. A novel, data-driven formulation for calculating MFIs based on fuzzy analysis is developed in this paper. Different geo-variables are treated as fuzzy sets, and appropriate membership functions are defined and modelled for them. A new weighted-average-type aggregation operator is then introduced to generate a new fuzzy set representing mineral favourability. The membership grades of this new fuzzy set are taken as the MFI. The weights for the aggregation operation combine the individual membership functions of the geo-variables and are derived using information from training areas and L1 regression. The technique is demonstrated in a case study of skarn tin deposits and is used to integrate geological, geochemical and magnetic data. The study area covers a total of 22.5 km² and is divided into 349 cells, which include nine control cells. Nine geo-variables are considered in this study. Depending on the nature of the various geo-variables, four different types of membership functions are used to model their fuzzy memberships.
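A minimal sketch of the aggregation step is given below, assuming hand-picked linear membership functions and weights; the paper derives its weights from the control cells by regression and fits its membership functions to the data, so everything here (variables, ranges, weights, cell values) is illustrative only.

```python
"""Illustrative fuzzy mineral-favourability index (MFI): each geo-variable
is mapped to a membership grade and the grades are combined with a
weighted-average aggregation operator. Membership functions, weights and
cell data are invented."""
import numpy as np

def linear_membership(x, low, high):
    # Simple increasing membership function bounded to [0, 1].
    return np.clip((x - low) / (high - low), 0.0, 1.0)

# Hypothetical per-cell geo-variables (four cells).
sn_geochemistry_ppm = np.array([5.0, 40.0, 120.0, 300.0])
distance_to_granite_m = np.array([2500.0, 900.0, 300.0, 50.0])
magnetic_anomaly_nt = np.array([10.0, 80.0, 150.0, 400.0])

memberships = np.vstack([
    linear_membership(sn_geochemistry_ppm, 10.0, 250.0),
    linear_membership(-distance_to_granite_m, -2000.0, -100.0),  # closer is better
    linear_membership(magnetic_anomaly_nt, 50.0, 350.0),
])

weights = np.array([0.5, 0.3, 0.2])   # assumed; data-driven in the paper
mfi = weights @ memberships           # weighted-average aggregation

for cell, value in enumerate(mfi, start=1):
    print(f"cell {cell}: MFI = {value:.2f}")
```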

Relevance: 70.00%

Abstract:

Document classification is a supervised machine learning process in which predefined category labels are assigned to documents based on a hypothesis derived from a training set of labelled documents. Documents cannot be directly interpreted by a computer system unless they have been modelled as a collection of computable features. Rogati and Yang [M. Rogati and Y. Yang, Resource selection for domain-specific cross-lingual IR, in SIGIR 2004: Proceedings of the 27th Annual International Conference on Research and Development in Information Retrieval, ACM Press, Sheffield, United Kingdom, pp. 154-161] pointed out that the effectiveness of a document classification system may vary across domains. This implies that the quality of the document model contributes to the effectiveness of document classification. Conventionally, model evaluation is accomplished by comparing the effectiveness scores of classifiers on model candidates. However, this kind of evaluation method may encounter either under-fitting or over-fitting problems, because the effectiveness scores are restricted by the learning capacities of the classifiers. We propose a model fitness evaluation method to determine whether a model is sufficient to distinguish positive and negative instances while still being competent to provide satisfactory effectiveness with a small feature subset. Our experiments demonstrate how the fitness of models is assessed. The results of our work contribute to research on feature selection, dimensionality reduction and document classification.
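The sketch below, using a toy corpus and scikit-learn, illustrates the underlying question rather than the paper's own fitness measure: whether a small feature subset still separates positive and negative instances with acceptable effectiveness. The corpus, labels, feature-ranking choice (chi-squared) and classifier are all assumptions made for the example.

```python
"""Sketch of checking whether a small feature subset still separates the
classes, contrasted with a larger subset. Toy corpus; the paper's own
model fitness measure is not reproduced here."""
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.feature_selection import SelectKBest, chi2
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

docs = [
    "stock markets rallied on strong earnings",
    "the central bank raised interest rates",
    "the midfielder scored a late winning goal",
    "the coach praised the team defence",
    "bond yields fell after the inflation report",
    "the striker signed a new club contract",
]
labels = [1, 1, 0, 0, 1, 0]   # 1 = finance, 0 = sport

X = TfidfVectorizer().fit_transform(docs)

for k in (2, 10):
    # Keep only the k highest-scoring features and measure effectiveness.
    X_k = SelectKBest(chi2, k=min(k, X.shape[1])).fit_transform(X, labels)
    acc = cross_val_score(LogisticRegression(), X_k, labels, cv=3).mean()
    print(f"top-{k} features: cross-validated accuracy = {acc:.2f}")
```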

Relevance: 70.00%

Abstract:

How can empirical evidence of adverse effects from exposure to noxious agents, which is often incomplete and uncertain, be used most appropriately to protect human health? We examine several important questions on the best uses of empirical evidence in regulatory risk management decision-making raised by the US Environmental Protection Agency (EPA)'s science-policy concerning uncertainty and variability in human health risk assessment. In our view, the US EPA (and other agencies that have adopted similar views of risk management) can often improve decision-making by decreasing reliance on default values and assumptions, particularly when causation is uncertain. This can be achieved by more fully exploiting decision-theoretic methods and criteria that explicitly account for uncertain, possibly conflicting scientific beliefs and that can be fully studied by advocates and adversaries of a policy choice, in administrative decision-making involving risk assessment. The substitution of decision-theoretic frameworks for default assumption-driven policies also allows stakeholder attitudes toward risk to be incorporated into policy debates, so that the public and risk managers can more explicitly identify the roles of risk-aversion or other attitudes toward risk and uncertainty in policy recommendations. Decision theory provides a sound scientific way explicitly to account for new knowledge and its effects on eventual policy choices. Although these improvements can complicate regulatory analyses, simplifying default assumptions can create substantial costs to society and can prematurely cut off consideration of new scientific insights (e.g., possible beneficial health effects from exposure to sufficiently low 'hormetic' doses of some agents). In many cases, the administrative burden of applying decision-analytic methods is likely to be more than offset by improved effectiveness of regulations in achieving desired goals. Because many foreign jurisdictions adopt US EPA reasoning and methods of risk analysis, it may be especially valuable to incorporate decision-theoretic principles that transcend local differences among jurisdictions.
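A minimal worked example of the decision-theoretic reasoning advocated here is sketched below: rather than committing to a single default assumption, each candidate scientific belief is carried with an explicit probability and the policy with the lowest expected loss is chosen. The beliefs, their probabilities, the policy options and the loss figures are all invented for illustration.

```python
"""Toy expected-loss calculation illustrating the decision-theoretic
alternative to default assumptions discussed above. All probabilities
and loss figures are hypothetical."""

# Probability assigned to each candidate scientific belief.
beliefs = {"linear no-threshold": 0.6, "threshold (hormetic)": 0.4}

# Loss (health burden plus compliance cost, arbitrary units) of each
# policy option under each belief.
losses = {
    "strict limit":   {"linear no-threshold": 2.0, "threshold (hormetic)": 5.0},
    "moderate limit": {"linear no-threshold": 3.0, "threshold (hormetic)": 2.0},
    "no limit":       {"linear no-threshold": 9.0, "threshold (hormetic)": 1.0},
}

expected = {
    policy: sum(beliefs[b] * loss for b, loss in table.items())
    for policy, table in losses.items()
}
for policy, value in sorted(expected.items(), key=lambda kv: kv[1]):
    print(f"{policy:15s} expected loss = {value:.2f}")
print("choose:", min(expected, key=expected.get))
```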

Relevance: 70.00%

Abstract:

Project-based assessment, in the form of take-home exams, was trialed in an honours/masters-level electromagnetic theory course. This assessment formed an integral part of the learning experience, and students felt that it was an effective method of learning.

Relevance: 70.00%

Abstract:

Industry cluster policies are a current trend in local economic development programmes and represent a major shift from traditional approaches. This trend has been coupled with an increasing interest in the new media industry as a significant focus for regional development strategies. In England, clusters and the new media industry have therefore come to be seen as important tools in promoting local and regional economic development. This study aimed to ascertain the success of these policies. In order to achieve these aims, the Birmingham new media industry was chosen as the case study. In addition to an extensive review of the literature, semi-structured interviews were conducted with new media firms and Business Support Agencies (BSAs) offering programmes to promote the development of the new media industry cluster. The key findings of the thesis are that the concerns of new media firms when choosing their location do not conform to industry cluster theory. Moreover, close geographical proximity between firms does not mean there is collaboration, and any costs saved as a result of proximity to similar firms are at present seen as irrelevant because of the type of products they offer. Building trust between firms is the key to developing the new media industry cluster, and the BSAs can act as brokers and provide neutral ground to develop it. The key policy recommendations are that the new media industry is continually changing, and research must continuously track and analyse cluster dynamics in order to be aware of emerging trends and future developments that can positively or negatively affect the cluster. Policy makers need to keep in mind that there is no uniform toolkit for fostering the different sectors in cluster development. It is also important for them to win the support and trust of new media firms, since this is key to the success of the cluster. When cluster programmes are introduced, their benefits must be explained to firms more effectively in order to encourage participation. The general conclusions of the thesis are that clusters are a potentially important tool in local economic development policy and that the new media industry has considerable growth potential. The kinds of relationships which cluster theory suggests develop between firms do not, as yet, appear to exist within the new media cluster. There are, however, steps that the BSAs can take to encourage their development. Thus, the BSAs need to ensure that they establish an environment that enables growth of the industry.

Relevance: 70.00%

Abstract:

New Technology Based Firms (NTBFs) are considered to be important for the economic development of a country with regard to both employment growth and innovative activity. The latter is believed to contribute significantly to the increase in productivity and therefore to the competitiveness of the UK economy. This study contributes to the above literature by investigating two of the factors believed to limit the growth of such firms in the UK. The first concerns the existence of a 'knowledge gap', while the second concerns the existence of a 'financial gap'. These themes are developed along three main research lines. Firstly, based upon the human capital theory initially proposed by Becker (1964), new evidence is provided on the human capital characteristics (experience and education) of current UK NTBF entrepreneurs. Secondly, the causal effect of general and specific human capital (as well as their interactions) on company performance and growth is investigated, both via its traditional direct effect and via its indirect effect on access to external finance. Finally, more light is shed on the financial structure and the type of financial constraints that high-tech firms face at start-up. In particular, whether a financial gap exists is explored by distinguishing between the demand for and the supply of external finance, as well as by type of external financing source. The various research hypotheses are tested empirically using an original survey of new technology based firms, defined as independent companies established in the past 25 years in R&D-intensive sectors. The resulting dataset contains information for 412 companies on a number of general company characteristics and on the characteristics of their entrepreneurs in 2004. Policy and practical implications are provided for future and current entrepreneurs as well as for providers of external finance.
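As an illustration of the direct-versus-indirect-effect design described above, the sketch below runs a simple two-step mediation on simulated data: one regression of access to finance on human capital, and one of growth on human capital plus access to finance. The variable names, data-generating process and estimator choices are assumptions for the example, not the thesis's actual specification.

```python
"""Hypothetical sketch of testing a direct effect of founders' human
capital on firm growth and an indirect effect via access to external
finance. Simulated data; all coefficients and variables are invented."""
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 412  # same order of magnitude as the survey described above

education_yrs = rng.normal(16, 2, n)
industry_experience = rng.normal(8, 4, n)
# Access to external finance partly driven by human capital (assumed).
access_finance = (0.2 * education_yrs + 0.1 * industry_experience
                  + rng.normal(0, 1, n) > 4.2).astype(float)
employment_growth = (0.3 * industry_experience + 1.5 * access_finance
                     + rng.normal(0, 2, n))

# Step 1: indirect channel -- does human capital predict access to finance?
X_med = sm.add_constant(np.column_stack([education_yrs, industry_experience]))
mediator_model = sm.OLS(access_finance, X_med).fit()

# Step 2: direct effect on growth, holding access to finance fixed.
X_out = sm.add_constant(np.column_stack([education_yrs, industry_experience, access_finance]))
outcome_model = sm.OLS(employment_growth, X_out).fit()

print("finance ~ const, education, experience:", mediator_model.params.round(3))
print("growth  ~ const, education, experience, finance:", outcome_model.params.round(3))
```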

Relevance: 70.00%

Abstract:

Renewable energy project development is highly complex, and success is by no means guaranteed. Decisions are often made with approximate or uncertain information, yet the methods currently employed by decision-makers do not necessarily accommodate this. Levelised energy costs (LEC) are one commonly applied measure used within the energy industry to assess the viability of potential projects and inform policy. The research proposes a method for accommodating such uncertainty by enhancing the traditional discounted LEC measure with fuzzy set theory. Furthermore, the research develops the fuzzy LEC (F-LEC) methodology to incorporate the cost of financing a project from debt and equity sources. Applied to an example bioenergy project, the research demonstrates the benefit of incorporating fuzziness for decision-making on project viability, optimal capital structure and key-variable sensitivity analysis. The proposed method contributes by incorporating uncertain and approximate information into the widely used LEC measure and by being applicable to a wide range of energy project viability decisions.
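A minimal sketch of a fuzzified LEC is given below, assuming triangular fuzzy inputs and a vertex-method approximation of the alpha-cut bounds; the paper's F-LEC additionally models the cost of debt and equity financing, and all figures here are invented for a hypothetical bioenergy project.

```python
"""Toy fuzzy levelised energy cost (LEC): uncertain inputs are triangular
fuzzy numbers (low, mode, high) and alpha-cut bounds on the LEC are
approximated with the vertex method. All figures are hypothetical."""
from itertools import product

# (low, mode, high) triangular fuzzy inputs -- assumed values.
capex = (9e6, 10e6, 12e6)            # GBP
annual_opex = (0.8e6, 1.0e6, 1.3e6)  # GBP / year
annual_energy = (28e3, 32e3, 34e3)   # MWh / year
discount_rate = (0.06, 0.08, 0.11)
LIFETIME = 20                        # years

def alpha_cut(tri, alpha):
    # Interval of values whose membership is at least alpha.
    lo, mode, hi = tri
    return (lo + alpha * (mode - lo), hi - alpha * (hi - mode))

def lec(capex, opex, energy, rate):
    # Classic discounted LEC: lifetime costs over lifetime energy.
    discount = sum(1.0 / (1.0 + rate) ** t for t in range(1, LIFETIME + 1))
    return (capex + opex * discount) / (energy * discount)

for alpha in (0.0, 0.5, 1.0):
    cuts = [alpha_cut(x, alpha) for x in (capex, annual_opex, annual_energy, discount_rate)]
    values = [lec(*combo) for combo in product(*cuts)]   # vertex method
    print(f"alpha={alpha:.1f}: LEC in [{min(values):.1f}, {max(values):.1f}] GBP/MWh")
```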

Relevance: 70.00%

Abstract:

The deterioration in staff-student ratios in UK higher education has had a disproportionate impact on assessment and feedback, meaning that contemporary students may have fewer assessments and much less feedback than a generation ago (Gibbs, 2006). Early use of a quiz assessment may offer a blend of social benefits (social comparison and shared problem solving leading to engagement, belonging and continuation), academic benefits (early formative assessment, immediate feedback) and administrative benefits (on-the-spot verbal marking and feedback to 230 students simultaneously). This study sought student views on the acceptability of the quiz and its contribution to learning. Social benefits were apparent, but difficulties in creating questions that elicit deeper reasoning and problem solving are discussed, and the quiz had limited pedagogic value in the eyes of participants. The use of assertion-reason questions is considered as a way of taking the table quiz to a higher level and extending its pedagogic value.

Relevance: 70.00%

Abstract:

Assessment criteria are increasingly incorporated into teaching, making it important to clarify the pedagogic status of the qualities to which they refer. We reviewed theory and evidence about the extent to which four core criteria for student writing (critical thinking, use of language, structuring, and argument) refer to the outcomes of three types of learning: generic skills learning, a deep approach to learning, and complex learning. The analysis showed that all four of the core criteria describe to some extent properties of text resulting from using skills, but none qualify fully as descriptions of the outcomes of applying generic skills. Most also describe certain aspects of the outcomes of taking a deep approach to learning. Critical thinking and argument correspond most closely to the outcomes of complex learning. At lower levels of performance, use of language and structuring describe the outcomes of applying transferable skills. At higher levels of performance, they describe the outcomes of taking a deep approach to learning. We propose that the type of learning required to meet the core criteria is most usefully and accurately conceptualized as the learning of complex skills, and that this provides a conceptual framework for maximizing the benefits of using assessment criteria as part of teaching.