Abstract:
This work seeks to identify the factors underlying the choices that people with special educational needs arising from visual impairment make during the transition from high school to higher education. We take into account that vocational guidance and the transition to adulthood acquire specific characteristics in the case of visually impaired young people, particularly with regard to continuing on to higher education. The focus of this work is to clarify which factors make this transition easier or harder, by observing visually impaired and blind students who complete high school. This issue has raised interest and concern about the strategies needed to ensure successful entrance into, and permanence in, the chosen course of higher education. However, without knowing the factors involved, it is difficult to design an appropriate intervention strategy. Therefore, in order to gain knowledge of the specific issues facing visually impaired young people who complete high school, we selected a special school for this disability and a group of students who will take part in this project.
Abstract:
Envisat Advanced Synthetic Aperture Radar (ASAR) Wide Swath Mode (WSM) images are used to derive C-band HH-polarization normalized radar cross sections (NRCS). These are compared with ice-core analysis and visual ship-based observations of snow and ice properties observed according to the Antarctic Sea Ice Processes and Climate (ASPeCt) protocol during two International Polar Year summer cruises (Oden 2008 and Palmer 2009) in West Antarctica. Thick first-year (TFY) and multi-year (MY) ice were the dominant ice types. The NRCS value ranges between -16.3 ± 1.1 and -7.6 ± 1.0 dB for TFY ice, and is -12.6 ± 1.3 dB for MY ice; for TFY ice, NRCS values increase from ~-15 dB to -9 dB from December/January to mid-February. In situ and ASPeCt observations are not, however, detailed enough to interpret the observed NRCS change over time. Co-located Advanced Microwave Scanning Radiometer-Earth Observing System (AMSR-E) vertically polarized 37 GHz brightness temperatures (TB37V), 7-day and 1-day averages, as well as the TB37V difference between ascending and descending AMSR-E overpasses, suggest the low NRCS values (-15 dB) are associated with snowmelt still being in progress, while the change towards higher NRCS values (-9 dB) is caused by the commencement of melt-refreeze cycles after about mid-January.
Abstract:
The consideration of real operating conditions in the design and optimization of a multijunction solar cell receiver-concentrator assembly is indispensable. Such a requirement calls for suitable modeling and simulation tools in order to complement the experimental work and circumvent its well-known burdens and restrictions. Three-dimensional distributed models have been demonstrated in the past to be a powerful choice for the analysis of distributed phenomena in single- and dual-junction solar cells, as well as for the design of strategies to minimize solar cell losses when operating under high concentrations. In this paper, we present the application of these models to the analysis of triple-junction solar cells under real operating conditions. The impact of different chromatic aberration profiles on the short-circuit current of triple-junction solar cells is analyzed in detail using the developed distributed model. Current spreading determines the impact of a given chromatic aberration profile on the solar cell I-V curve. The focus is on determining the role of current spreading in the connection between the photocurrent profile, subcell voltage and current, and the sheet resistance of the semiconductor layers.
Abstract:
Nowadays, computer simulators are becoming basic tools for education and training in many engineering fields. In the nuclear industry, the role of simulation in the training of nuclear power plant operators is also recognized as being of the utmost relevance. As an example, the International Atomic Energy Agency sponsors the development of nuclear reactor simulators for education and arranges the supply of such simulation programs. Aware of this, in 2008 Gas Natural Fenosa, a Spanish gas and electric utility that owns and operates nuclear power plants and promotes university education in the nuclear technology field, provided the Department of Nuclear Engineering of Universidad Politécnica de Madrid with the Interactive Graphic Simulator (IGS) of the "José Cabrera" (Zorita) nuclear power plant, an industrial facility whose commercial operation ceased definitively in April 2006. It is a state-of-the-art full-scope real-time simulator that was used for the training and qualification of the operators of the plant control room, as well as to understand and analyze the plant dynamics, and to develop, qualify, and validate its emergency operating procedures.
Abstract:
Jóvenes Nucleares (Spanish Young Generation in Nuclear, JJNN) is a non-profit organization and a commission of the Spanish Nuclear Society (SNE). The Universidad Politécnica de Madrid (Technical University of Madrid, UPM) is one of the most prestigious technical universities in Spain and has a very strong curriculum in nuclear engineering training and research. At the end of 2009, JJNN and UPM started to plan a new, first-of-a-kind Seminar on Nuclear Safety focused on Advanced Reactors (Generation III, III+ and IV). The aim was to give a general description of safety in the new reactors, comparing them with existing Generation II reactors from a technical point of view, yet kept simple and not requiring a strong background in nuclear engineering, so as to be of interest to as broad an audience as possible.
Abstract:
This communication presents an overview of their first results and innovative methodologies, focusing on their possibilities and limitations for the reconstruction of recent floods and paleofloods around the world.
Abstract:
Preliminary studies have been performed to design a device for nuclear waste transmutation and hydrogen generation based on a gas-cooled pebble-bed accelerator-driven system, TADSEA (Transmutation Advanced Device for Sustainable Energy Application). In previous studies we addressed the viability of an ADS transmutation device that uses waste from existing LWR power plants as fuel, encapsulated in graphite in the form of pebble beds and cooled by helium, which enables high temperatures (on the order of 1200 K) in order to generate hydrogen from water either by high-temperature electrolysis or by thermochemical cycles. For the design of this device several configurations were studied, including several reflector thicknesses, to achieve the desired parameters: the transmutation of nuclear waste and the production of 100 MW of thermal power. In this paper, new studies on a deep-burn in-core fuel management strategy for LWR waste are presented. The fuel cycle of the TADSEA device has been analyzed based on both the driver and transmutation fuel proposed in the General Atomics design of a gas-turbine modular helium reactor. The transmutation results of the three fuel management strategies, using driver, transmutation, and standard LWR spent fuel, are compared, and several parameters describing the neutron performance of the TADSEA core, such as the fuel and moderator temperature reactivity coefficients and the transmutation chain, are also presented.
Abstract:
Among the programming paradigms of computer science we find Logic Programming, whose main exponent is the Prolog language. Prolog programs consist of a set of predicates, each defined by rules that provide the programmer with a high level of abstraction and declarativeness. However, rule-based formulations frequently mean that a predicate is recomputed several times for the same query; moreover, Prolog uses a fixed order to evaluate rules and goals (SLD evaluation) that can fall into infinite loops when executing declaratively correct recursive rules. These limitations are tackled at the root by tabling, which is based on remembering, in a table, the calls made and their solutions. Thus, if a call is repeated, its solutions are already available and recomputation is avoided. Tabling also avoids infinite loops, since the calls that generate them are suspended, waiting for solutions to be computed for them. Implementing tabling is not simple. In particular, it requires three operations that cannot all be executed in constant time simultaneously: call suspension, call resumption, and variable access. The first part of the thesis compares three implementations of tabling on top of Ciao, each of which penalizes one of these operations. Each solution therefore has its advantages and drawbacks, and performs better or worse depending on the program being executed. The second part of the thesis improves the functionality of tabling in order to combine it with constraints and also to avoid unnecessary computations. Constraint programming allows equations to be solved as a means of programming, a highly declarative mechanism.
We have developed a framework to combine tabling with constraints, prioritizing goals such as flexibility, efficiency, and generality, obtaining a synergy between the two techniques that can be applied in numerous applications. Another fundamental aspect of tabling concerns the moment at which the solutions of a tabled call are returned. Local evaluation returns solutions only once all the solutions of the tabled call have been computed. By contrast, batched evaluation returns solutions one by one as they are computed, so it adapts better to problems where we are not interested in finding all solutions. However, its memory consumption is exponentially worse than that of local evaluation. The thesis presents swapping evaluation, which returns solutions as soon as they are computed but with memory consumption similar to that of local evaluation. In addition, pruning operators such as once/1 are implemented to discard the search for alternative solutions once the desired solution has been found. Finally, Prolog adopts solutions for parallelism relatively easily, thanks to its flexible execution control and its logical assignments. The third part of the thesis extends Ciao's conjunctive parallelism to work with non-deterministic programs, which presents two main problems: trapped goals and goal recomputation. Classical solutions for trapped goals broke many invariants of Prolog execution, making them difficult to maintain and extend; experience tells us they have fallen into disuse. We propose a modular, localized solution (based on the implementation of swapping evaluation) that does not break the invariants of Prolog execution, while maintaining high parallel execution performance.
Regarding the recomputation of parallel goals in the presence of non-determinism, we have adapted techniques derived from tabling to memoize the computations of these goals and avoid their recomputation.
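The core idea of tabling described above (recording calls and their answers in a table so that a repeated call reuses stored answers instead of recomputing or looping) can be sketched as a memoization-style fixpoint. This is an illustrative Python analogy, not the Ciao implementation; the graph, names, and API are hypothetical:

```python
# Illustrative sketch of tabling as memoization: a naive SLD-style
# recursion over a cyclic graph loops forever on the a <-> b cycle,
# while a tabled evaluation records answers in a table and never
# re-launches a call whose answer is already known.

EDGES = {"a": ["b"], "b": ["a", "c"], "c": []}  # hypothetical cyclic graph

def tabled_reachable(start):
    """Answers for a reachable(start, X)-style query, computed as a
    fixpoint over an answer table instead of by unbounded recursion."""
    table = set()          # answers found so far (the "table")
    frontier = [start]     # calls pending resumption
    while frontier:
        node = frontier.pop()
        for nxt in EDGES.get(node, []):
            if nxt not in table:   # repeated answers are reused, not recomputed
                table.add(nxt)
                frontier.append(nxt)
    return table

print(sorted(tabled_reachable("a")))  # prints ['a', 'b', 'c']
```

A naive recursive traversal of the same graph would recurse forever on the a-b cycle, which mirrors how SLD evaluation loops on declaratively correct recursive rules while a tabled evaluation terminates.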
Abstract:
Resource Analysis (a.k.a. Cost Analysis) tries to approximate the cost of executing programs as functions on their input data sizes, without actually having to execute the programs. While a powerful resource analysis framework for object-oriented programs existed before this thesis, advanced aspects that improve the efficiency, the accuracy, and the reliability of the analysis results still need to be investigated further. This thesis tackles this need from the following four perspectives. (1) Shared mutable data structures are the bane of formal reasoning and static analysis. Analyses which keep track of heap-allocated data are referred to as heap-sensitive. Recent work proposes locality conditions for soundly tracking field accesses by means of ghost non-heap-allocated variables. In this thesis we present two extensions to this approach: the first is to consider array accesses (in addition to object fields), while the second focuses on handling cases for which the locality conditions cannot be proven unconditionally, by finding aliasing preconditions under which tracking such heap locations is feasible. (2) The aim of incremental analysis is, given a program, its analysis results, and a series of changes to the program, to obtain the new analysis results as efficiently as possible and, ideally, without having to (re-)analyze fragments of code that are not affected by the changes. During software development, programs are permanently modified, but most analyzers still read and analyze the entire program at once in a non-incremental way.
This thesis presents an incremental resource usage analysis which, after a change in the program is made, is able to reconstruct the upper-bounds of all affected methods in an incremental way. To this purpose, we propose (i) a multi-domain incremental fixed-point algorithm which can be used by all global analyses required to infer the cost, and (ii) a novel form of cost summaries that allows us to incrementally reconstruct only those components of cost functions affected by the change. (3) Resource guarantees that are automatically inferred by static analysis tools are generally not considered completely trustworthy, unless the tool implementation or the results are formally verified. Performing full-blown verification of such tools is a daunting task, since they are large and complex. In this thesis we focus on the development of a formal framework for the verification of the resource guarantees obtained by the analyzers, instead of verifying the tools. We have implemented this idea using COSTA, a state-of-the-art cost analyzer for Java programs and KeY, a state-of-the-art verification tool for Java source code. COSTA is able to derive upper-bounds of Java programs while KeY proves the validity of these bounds and provides a certificate. The main contribution of our work is to show that the proposed tools cooperation can be used for automatically producing verified resource guarantees. (4) Distribution and concurrency are today mainstream. Concurrent objects form a well established model for distributed concurrent systems. In this model, objects are the concurrency units that communicate via asynchronous method calls. Distribution suggests that analysis must infer the cost of the diverse distributed components separately. In this thesis we propose a novel object-sensitive cost analysis which, by using the results gathered by a points-to analysis, can keep the cost of the diverse distributed components separate.
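The incremental analysis idea from point (2) above, reusing cached method summaries and re-analyzing only the methods affected by a change, can be sketched as follows. This is a minimal illustrative Python model, not COSTA's algorithm; the call graph, cost model, and function names are all assumptions:

```python
# Minimal sketch (not COSTA's algorithm) of incremental cost analysis:
# a method's upper bound is its own cost plus its callees' bounds;
# after an edit, only the changed method and its transitive callers
# are re-analyzed, while unaffected summaries are reused.

CALLS = {"main": ["f", "g"], "f": [], "g": []}   # assumed call graph
OWN   = {"main": 1, "f": 2, "g": 3}              # assumed per-method cost

def analyze(method, cache):
    """Compute (or fetch from the cache) the cost summary of `method`."""
    if method not in cache:
        cache[method] = OWN[method] + sum(analyze(c, cache) for c in CALLS[method])
    return cache[method]

def invalidate(method, cache):
    """Drop the cached summaries of `method` and its transitive callers."""
    cache.pop(method, None)
    for caller, callees in CALLS.items():
        if method in callees and caller in cache:
            invalidate(caller, cache)

cache = {}
analyze("main", cache)         # full analysis: 1 + 2 + 3 = 6
OWN["f"] = 5                   # simulated program change inside f
invalidate("f", cache)         # drops f and main; g's summary is reused
print(analyze("main", cache))  # prints 9 (= 1 + 5 + 3)
```

The point of the sketch is the invalidation step: after the change to `f`, the summary of `g` is taken from the cache rather than recomputed, which is the source of the efficiency gain in incremental analysis.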
Abstract:
There are many industries that use highly technological solutions to improve quality in all of their products. The steel industry is one example. Several automatic surface-inspection systems are used in the steel industry to identify various types of defects and to help operators decide whether to accept, reroute, or downgrade the material, subject to the assessment process. This paper focuses on promoting a strategy that considers all defects in an integrated fashion, managing the uncertainty about the exact position of a defect under different process conditions by means of Gaussian additive influence functions. The relevance of the approach lies in making consistency and reliability between surface-inspection systems possible. The results obtained are an increase in confidence in the automatic inspection system and the ability to introduce improved prediction and advanced routing models. The prediction is provided to technical operators to support their decision-making. It shows the improvement gained by reducing the 40% of coils that are downgraded at the hot strip mill because of specific defects. In addition, this technology increases by 50% the accuracy of the estimate of defect survival after the cleaning facility, in comparison with the former approach. The proposed technology is implemented by means of software-based multi-agent solutions, which make possible the independent treatment of information, presentation, quality analysis, and other relevant functions.
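The Gaussian additive influence idea, where each reported defect contributes a Gaussian centred on its estimated position (with the width reflecting the positional uncertainty of the inspection system) and contributions from all systems are summed into one severity profile, can be sketched as follows. This is a minimal sketch under assumed names and parameters, not the paper's implementation:

```python
# Minimal sketch of Gaussian additive influence functions for defect
# position uncertainty. Each defect (position, sigma, weight) adds a
# Gaussian bump to a severity profile along the coil; overlapping
# bumps from different inspection systems reinforce each other.
# All names and parameter values are illustrative assumptions.
import math

def influence(x, centre, sigma):
    """Unnormalized Gaussian influence of one defect at position x."""
    return math.exp(-0.5 * ((x - centre) / sigma) ** 2)

def severity_profile(defects, length, step=1.0):
    """Sum the influences of all defects over positions 0..length."""
    xs = [i * step for i in range(int(length / step) + 1)]
    return [(x, sum(w * influence(x, c, s) for c, s, w in defects)) for x in xs]

# Two reports of (possibly) the same defect from different inspectors:
profile = severity_profile([(10.0, 2.0, 1.0), (12.0, 3.0, 0.5)], length=20)
peak = max(profile, key=lambda p: p[1])
print(round(peak[0]))  # prints 10: the peak sits near the stronger report
```

Summing influences rather than matching discrete defect positions is what lets reports with different positional accuracies be treated consistently across inspection systems.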