806 results for BEP (Beach Evaluation Program)
Abstract:
Information generated by abstract interpreters has long been used to perform program specialization. Additionally, if the abstract interpreter generates a multivariant analysis, it is also possible to perform multiple specialization. Information about values of variables is propagated by simulating program execution and performing fixpoint computations for recursive calls. In contrast, traditional partial evaluators (mainly) use unfolding for both propagating values of variables and transforming the program. It is known that abstract interpretation is a better technique for propagating success values than unfolding. However, the program transformations induced by unfolding may lead to important optimizations which are not directly achievable in the existing frameworks for multiple specialization based on abstract interpretation. The aim of this work is to devise a specialization framework which integrates the better information propagation of abstract interpretation with the powerful program transformations performed by partial evaluation, and which can be implemented via small modifications to existing generic abstract interpreters. With this aim, we will relate top-down abstract interpretation with traditional concepts in partial evaluation and sketch how the sophisticated techniques developed for controlling partial evaluation can be adapted to the proposed specialization framework. We conclude that there can be both practical and conceptual advantages in the proposed integration of partial evaluation and abstract interpretation.
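To make the unfolding-based transformation that this abstract contrasts with abstract interpretation concrete, here is a minimal illustrative sketch (in Python rather than the logic-programming setting of the paper, and not the paper's actual framework): a recursive function is specialized for a statically known argument by unfolding its recursive calls away.

```python
# Illustrative sketch only: unfolding-style specialization of a recursive
# function when one argument is known at specialization time.

def power(x, n):
    # General, unspecialized definition.
    return 1 if n == 0 else x * power(x, n - 1)

def specialize_power(n):
    # "Unfold" the n recursive calls at specialization time, leaving a
    # residual function that only performs the multiplications.
    def specialized(x):
        result = 1
        for _ in range(n):  # each iteration stands for one unfolded call
            result *= x
        return result
    return specialized

cube = specialize_power(3)          # residual program for n = 3
assert cube(2) == power(2, 3) == 8  # behaves like the original
```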
Abstract:
This paper describes the language identification (LID) system developed by the Patrol team for the first phase of the DARPA RATS (Robust Automatic Transcription of Speech) program, which seeks to advance state-of-the-art detection capabilities on audio from highly degraded communication channels. We show that techniques originally developed for LID on telephone speech (e.g., for the NIST language recognition evaluations) remain effective on the noisy RATS data, provided that careful consideration is applied when designing the training and development sets. In addition, we show significant improvements from the use of Wiener filtering, neural-network-based and language-dependent i-vector modeling, and fusion.
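As a minimal illustration of the Wiener-filtering step mentioned above (a generic sketch assuming SciPy is available, not the front-end actually used by the Patrol system), a noisy 1-D signal can be denoised before feature extraction as follows.

```python
# Generic Wiener-filter denoising sketch; all parameters are illustrative.
import numpy as np
from scipy.signal import wiener

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 8000)                     # 1 s at a toy 8 kHz rate
clean = np.sin(2 * np.pi * 200 * t)                 # stand-in for a speech band
noisy = clean + 0.3 * rng.standard_normal(t.shape)  # simulate a degraded channel
denoised = wiener(noisy, mysize=29)                 # local Wiener estimate
```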
Abstract:
The ex ante quantification of impacts is compulsory when establishing a Rural Development Program (RDP) in the European Union. The purpose of this paper is therefore to learn how to perform it better. To this end, all of the European 2007-2013 RDPs (a total of 88) and all of their corresponding available ex ante evaluations were analyzed. Results show that less than 50% of the RDPs quantify all the impact indicators and that the most used methodology that allows the quantification of all impact indicators is Input-Output analysis. Two main difficulties are cited for not accomplishing the impact quantification: the heterogeneity of actors and factors involved in the program impacts, and the lack of needed information. These difficulties should be addressed by using new methods that allow approaching the complexity of the programs and by implementing better planning that facilitates gathering the needed information.
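For readers unfamiliar with the Input-Output methodology mentioned above, the core calculation is the Leontief impact equation. The sketch below uses hypothetical coefficients purely for illustration and is not taken from any RDP evaluation.

```python
# Leontief input-output impact calculation: dx = (I - A)^(-1) * df,
# where A holds technical coefficients and df is the final-demand shock
# attributed to the program. All numbers below are hypothetical.
import numpy as np

A = np.array([[0.20, 0.10],   # inter-sector input requirements per unit of output
              [0.30, 0.25]])
df = np.array([10.0, 0.0])    # extra final demand injected by the program

dx = np.linalg.solve(np.eye(2) - A, df)  # resulting sectoral output impacts
print(dx)
```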
Abstract:
The latest technology and architectural trends have significantly expanded the use of a large variety of glass products in construction which, depending on their characteristics, allow structural glass elements to be designed and calculated under safe conditions. This paper presents the evaluation and analysis of the damping properties of rectangular laminated glass plates of 1.938 m x 0.876 m with different thicknesses depending on the number of PVB interlayers arranged. By means of numerical simulation and experimental verification using modal analysis, the natural frequencies and damping of the glass plates were calculated, both under free boundary conditions and under the operational conditions of the impact test equipment used in the experimental program, as specified by the European standard UNE-EN 12600:2003.
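As general modal-analysis background (not necessarily the exact identification procedure used in this study), the damping ratio of each mode is commonly estimated from the measured frequency response with the half-power bandwidth method:

$$\zeta_r = \frac{f_2 - f_1}{2 f_r},$$

where $f_r$ is the resonance frequency of mode $r$ and $f_1$, $f_2$ are the frequencies at which the response amplitude drops to $1/\sqrt{2}$ of its peak value.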
Abstract:
The management of long-lived radioactive wastes produced by nuclear reactors constitutes one of the main challenges of nuclear technology today. A possible option for their management is the transmutation of long-lived nuclides into shorter-lived ones. Accelerator Driven Subcritical Systems (ADS) are one of the technologies under development to achieve this goal. An ADS consists of a subcritical nuclear reactor maintained in a steady state by an external neutron source driven by a particle accelerator. The interest of these systems lies in their capacity to be loaded with fuels having larger contents of minor actinides than conventional critical reactors, thereby increasing the transmutation rates of these elements, which are mainly responsible for the long-term radiotoxicity of nuclear waste. One of the key points identified for the operation of an industrial-scale ADS is the need to continuously monitor the reactivity of the subcritical system during operation. For this reason, since the 1990s a number of experiments have been conducted in zero-power subcritical assemblies (MUSE, RACE, KUCA, Yalina, GUINEVERE/FREYA) in order to validate these techniques experimentally. In this context, the present thesis is concerned with the validation of reactivity monitoring techniques at the Yalina-Booster subcritical assembly. This assembly belongs to the Joint Institute for Power and Nuclear Research (JIPNR-Sosny) of the National Academy of Sciences of Belarus. Experiments concerning reactivity monitoring were performed in this facility in 2008 under the EUROTRANS project of the 6th EU Framework Program, under the direction of CIEMAT. Two types of experiments were carried out: experiments with a pulsed neutron source (PNS) and experiments with a continuous source with short interruptions (beam trips). For the PNS experiments, two fundamental techniques exist to measure the reactivity, known as the prompt-to-delayed neutron area-ratio technique (or Sjöstrand technique) and the prompt neutron decay constant technique. However, previous experiments have shown the need to apply correction techniques to account for the spatial and energy effects present in a real system and thus obtain accurate values of the reactivity. In this thesis, these corrections have been investigated through simulations of the system with the Monte Carlo code MCNPX. This research has also served to propose a generalized version of these techniques, in which relationships between the reactivity of the system and the measured quantities are obtained through Monte Carlo simulations. The second type of experiments, with a continuous source and beam trips, is more likely to be employed in an industrial ADS. The generalized version of the techniques developed for the PNS experiments has also been applied to the results of these experiments.
Furthermore, the work presented in this thesis is, to my knowledge, the first time that the reactivity of a subcritical system has been monitored during operation simultaneously with three different techniques: the current-to-flux, the source-jerk, and the prompt neutron decay techniques. The cases analyzed include a fast variation of the system reactivity (insertion and extraction of a control rod) and a fast variation of the neutron source (a long beam interruption and subsequent recovery).
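For reference, the two PNS techniques named above can be written in their idealized point-kinetics form (i.e., before the spatial and energy corrections that this thesis investigates with MCNPX); the expressions below are standard textbook relations, not results of the thesis:

$$\frac{\rho}{\beta_{\mathrm{eff}}} = -\frac{A_p}{A_d} \quad \text{(Sjöstrand area-ratio technique)}, \qquad \rho = \beta_{\mathrm{eff}} + \alpha\,\Lambda \quad \text{(prompt decay constant technique)},$$

where $A_p$ and $A_d$ are the areas under the prompt and delayed parts of the detector response, $\alpha$ is the prompt neutron decay constant (negative for a subcritical system), and $\Lambda$ is the mean neutron generation time.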
Abstract:
This paper presents shake-table tests conducted on a two-fifths-scale reinforced concrete frame representing a conventional construction design under current building code provisions in the Mediterranean area. The structure was subjected to a sequence of dynamic tests including free vibrations and four seismic simulations in which a historical ground motion record was scaled to levels of increasing intensity until collapse. Each seismic simulation was associated with a different level of seismic hazard, representing very frequent, frequent, rare and very rare earthquakes. The structure remained basically undamaged and within the inter-story drift limits of the "immediate occupancy" performance level for the very frequent and frequent earthquakes. For the rare earthquake, the specimen sustained significant damage with chord rotations of up to 28% of its ultimate capacity and approached the upper bound limit of inter-story drift associated with "life safety". The specimen collapsed at the beginning of the "very rare" seismic simulation. Besides summarizing the experimental program, this paper evaluates the damage quantitatively at the global and local levels in terms of chord rotation and other damage indexes, together with the energy dissipation demands for each level of seismic hazard. Further, the ratios of column-to-beam moment capacity recommended by Eurocode 8 and ACI-318 to guarantee the formation of a strong column-weak beam mechanism are examined.
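For context, the column-to-beam capacity checks examined in the last part of the paper are, as written in the design codes themselves (quoted here from the codes, not from the paper's results):

$$\sum M_{Rc} \ge 1.3 \sum M_{Rb} \quad \text{(Eurocode 8)}, \qquad \sum M_{nc} \ge \tfrac{6}{5} \sum M_{nb} \quad \text{(ACI 318)},$$

where the sums are taken over the flexural capacities of the columns and beams framing into each joint.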
Abstract:
Accreditation models in the international context mainly consider the evaluation of learning outcomes and the ability of programs (or higher education institutions) to achieve the educational objectives stated in their mission. However, it is not clear whether these objectives, and therefore their outcomes, satisfy real national and regional needs, a critical point for engineering master's programs, especially in developing countries. The aim of this paper is to study the importance of evaluating the local relevancy of these programs and to analyze the main models of quality assurance and accreditation bodies in the USA, Europe, and Latin America, in order to ascertain whether relevancy is evaluated or not. After a literature review, we found that in a context of free-market economics and international education, the accreditation of master's programs follows an international accreditation model and in most cases does not take into account criteria and indicators for local relevancy. We conclude that both are necessary: international accreditation to ensure the effectiveness of the program (achievement of learning outcomes), and national accreditation through which the local relevancy of programs can be ensured, for which we provide some indicators.
Abstract:
Presenter, student, and teacher evaluation forms and program flyer for the 6th Annual Lincoln University Sonia Kovalevsky Math for Girls Day, held on April 29, 2011.
Abstract:
Presenter, student, and teacher evaluation forms and program flyer for the 9th Annual Lincoln University Sonia Kovalevsky Math for Girls Day, held on April 25, 2014.
Abstract:
Acknowledgments. Financial Support: HERU and HSRU receive a core grant from the Chief Scientist's Office of the Scottish Government Health and Social Care Directorates, and the Centre for Clinical Epidemiology & Evaluation is funded by Vancouver Coastal Health Authority. The model used for the illustrative case study in this paper was developed as part of an NHS Technology Assessment Review, funded by the National Institute for Health Research (NIHR) Health Technology Assessment Program (project number 09/146/01). The views and opinions expressed in this paper are those of the authors and do not necessarily reflect those of the Scottish Government, NHS, Vancouver Coastal Health, the NIHR HTA Program, or the Department of Health. The authors wish to thank Kathleen Boyd and members of the audience at the UK Health Economists Study Group for comments received on an earlier version of this paper. We also wish to thank Cynthia Fraser (University of Aberdeen) for literature searches undertaken to inform the manuscript, and Mohsen Sadatsafavi (University of British Columbia) for comments on an earlier draft.
Abstract:
Support for molecular biology researchers has been limited to traditional library resources and services in most academic health sciences libraries. The University of Washington Health Sciences Libraries have been providing specialized services to this user community since 1995. The library recruited a Ph.D. biologist to assess the molecular biological information needs of researchers and design strategies to enhance library resources and services. A survey of laboratory research groups identified areas of greatest need and led to the development of a three-pronged program: consultation, education, and resource development. Outcomes of this program include bioinformatics consultation services, library-based and graduate level courses, networking of sequence analysis tools, and a biological research Web site. Bioinformatics clients are drawn from diverse departments and include clinical researchers in need of tools that are not readily available outside of basic sciences laboratories. Evaluation and usage statistics indicate that researchers, regardless of departmental affiliation or position, require support to access molecular biology and genetics resources. Centralizing such services in the library is a natural synergy of interests and enhances the provision of traditional library resources. Successful implementation of a library-based bioinformatics program requires both subject-specific and library and information technology expertise.
Abstract:
ALICE is one of the four major experiments at the LHC particle accelerator installed at the European laboratory CERN. The management committee of the LHC accelerator has recently approved a program upgrade for this experiment. Among the upgrades planned for the coming years of the ALICE experiment are improving the resolution and tracking efficiency while maintaining the excellent particle identification ability, and increasing the read-out event rate to 100 kHz. In order to achieve this, it is necessary to upgrade the Time Projection Chamber (TPC) and Muon tracking (MCH) detectors by modifying the read-out electronics, which is not suitable for this migration. To overcome this limitation, the design, fabrication, and experimental testing of a new ASIC named SAMPA has been proposed. This ASIC will support both positive and negative polarities, with 32 channels per chip and continuous data readout, with lower power consumption than the previous versions. This work covers the design, fabrication, and experimental testing of a readout front-end in 130 nm CMOS technology with configurable polarity (positive/negative), peaking time, and sensitivity. The new SAMPA ASIC can be used in both chambers (TPC and MCH). The proposed front-end is composed of a Charge Sensitive Amplifier (CSA) and a semi-Gaussian shaper. In order to obtain an ASIC integrating 32 channels per chip, the design of the proposed front-end requires small area and low power consumption, but at the same time low noise. In this sense, a new noise and PSRR (Power Supply Rejection Ratio) improvement technique for the CSA design, with no power or area penalty, is proposed in this work. The analysis and equations of the proposed circuit are presented and were verified by electrical simulations and by experimental tests of a fabricated chip with 5 channels of the designed front-end. The measured equivalent noise charge was <550 e− for a sensitivity of 30 mV/fC at an input capacitance of 18.5 pF. The total core area of the front-end was 2300 μm × 150 μm, and the measured total power consumption was 9.1 mW per channel.
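As a rough consistency check of the quoted figures (a back-of-the-envelope illustration, not taken from the SAMPA design documentation): for an ideal charge sensitive amplifier the output step is $V_{out} = Q_{in}/C_f$, so a sensitivity of 30 mV/fC corresponds to a feedback capacitance of roughly

$$C_f \approx \frac{1\ \mathrm{fC}}{30\ \mathrm{mV}} \approx 33\ \mathrm{fF},$$

and the measured ENC of 550 e⁻ ≈ 0.088 fC maps to about 2.6 mV of output-referred noise at that sensitivity.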
Abstract:
Equine Assisted Activities and Therapies (EAAT), including Therapeutic Horseback Riding (THR) and un-mounted equine assisted activities, are interventions aimed at improving the daily functioning and success of individuals with disabilities, including those with an autism spectrum disorder (ASD). While THR is frequently utilized as a treatment intervention for children with ASD, there are many limitations (individual's weight, horse health, weather, physical limitations, health conditions, etc.) that prevent this population from participating in mounted programs. Un-mounted equine assisted activities are often utilized as an alternative, but they are not informed by empirical research or a standardized treatment model. This paper provides a comprehensive review of the literature on EAAT, including un-mounted programs; an examination of organizational guidelines as they apply to un-mounted programs; and consultation with program directors regarding current practices in the field. Finally, it establishes recommendations for the development of a standard curriculum that would strengthen un-mounted horse care group programs serving children with ASD.
Abstract:
As world communication, technology, and trade become increasingly integrated through globalization, multinational corporations seek employees with global leadership experience and skills. However, the demand for these skills currently outweighs the supply. Given the rarity of globally ready leaders, global competency development should be emphasized in higher education programs. The reality, however, is that university graduate programs are often outdated and focus mostly on cognitive learning. Global leadership competence requires moving beyond the cognitive domain of learning to create socially responsible and culturally connected global leaders. This requires attention to development methods; however, limited research in global leadership development methods has been conducted. A new conceptual model, the global leadership development ecosystem, was introduced in this study to guide the design and evaluation of global leadership development programs. It was based on three theories of learning and was divided into four development methodologies. This study quantitatively tested the model and used it as a framework for an in-depth examination of the design of one International MBA program. The program was first benchmarked, by means of a qualitative best practices analysis, against the top-ranking IMBA programs in the world. Qualitative data from students, faculty, administrators, and staff was then examined, using descriptive and focused data coding. Quantitative data analysis, using PASW Statistics software, and a hierarchical regression, showed the individual effect of each of the four development methods, as well as their combined effect, on student scores on a global leadership assessment. The analysis revealed that each methodology played a distinct and important role in developing different competencies of global leadership. It also confirmed the critical link between self-efficacy and global leadership development.