848 results for "Evaluation of proposals for new courses"
Abstract:
Case-based reasoning (CBR) is a uniquely useful tool for the evaluation of possible failure of firms (EOPFOF) because of its ease of interpretation and implementation. Ensemble computing, a variation of group decision-making in society, provides a potential means of improving the predictive performance of CBR-based EOPFOF. This research integrates bagging and proportion case-basing with CBR to generate a proportion-bagging CBR method for EOPFOF. Diverse multiple case bases are first produced by proportion case-basing, in which a volume parameter controls the size of each case base. The classic case retrieval algorithm is then applied to generate diverse member CBR predictors. Majority voting, the mechanism most frequently used in ensemble computing, finally aggregates the outputs of the member CBR predictors into the final prediction of the CBR ensemble. In an empirical experiment, we statistically validated the results of the CBR ensemble from multiple case bases by comparing them with those of multivariate discriminant analysis, logistic regression, classic CBR, the best member CBR predictor and a bagging CBR ensemble. The results for Chinese EOPFOF three years in advance indicate that the new CBR ensemble, which significantly improved CBR's predictive ability, outperformed all the comparative methods.
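The ensemble procedure described in the abstract — multiple case bases whose size is controlled by a volume parameter, classic nearest-neighbour case retrieval, and majority voting — can be sketched as follows. This is a minimal illustration under assumed settings (toy data, 1-NN retrieval, invented parameter values), not the authors' exact configuration.

```python
import random
from collections import Counter

def retrieve_and_classify(case_base, query):
    # Classic case retrieval: return the label of the nearest case
    # (1-NN by squared Euclidean distance).
    nearest = min(case_base,
                  key=lambda c: sum((a - b) ** 2 for a, b in zip(c[0], query)))
    return nearest[1]

def proportion_bagging_cbr(full_case_base, query, n_members=15, volume=0.9, seed=0):
    # Each member CBR predictor retrieves from its own bootstrap case base;
    # 'volume' controls the size of each case base relative to the original.
    rng = random.Random(seed)
    size = max(1, int(volume * len(full_case_base)))
    votes = []
    for _ in range(n_members):
        member_base = [rng.choice(full_case_base) for _ in range(size)]
        votes.append(retrieve_and_classify(member_base, query))
    # Majority voting aggregates member outputs into the final prediction.
    return Counter(votes).most_common(1)[0][0]

# Toy firm data: feature vector -> label (1 = likely failure, 0 = healthy)
cases = [((0.10, 0.20), 0), ((0.20, 0.10), 0), ((0.15, 0.25), 0),
         ((0.90, 0.80), 1), ((0.85, 0.90), 1), ((0.95, 0.75), 1)]
pred = proportion_bagging_cbr(cases, (0.90, 0.85))
```

Because each member base is a random sample, individual members may err, but the majority vote is far more stable than any single retrieval.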
Abstract:
Governments are working on new policies to slow down total energy consumption and greenhouse gas (GHG) emissions, promoting the deployment of electric vehicles (EVs) in all countries. To facilitate this deployment and help reduce the final cost of their batteries, additional use of EVs while they are parked has been proposed. EVs can be used to minimize the total electricity cost of buildings (so-called vehicle-to-building, V2B, applications). This paper presents an economic evaluation of EVs within a Building Energy Management System. The optimal storage capacity and its equivalent number of EVs are determined. This value is then used to determine the optimal charging schedule to be applied to the batteries. From this schedule, the total expected profit is derived for the case of a real hotel in Spain.
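The charging-schedule idea — buy energy at cheap hours, supply the building at expensive ones — can be sketched with a toy greedy heuristic. The price profile, storage capacity, and charge rate below are invented for illustration; the paper's actual optimisation method is not reproduced here.

```python
def schedule_charging(prices, capacity_kwh, rate_kw):
    # Greedy V2B sketch: charge the parked EV batteries during the cheapest
    # hours (up to the storage capacity) and discharge the stored energy
    # during the most expensive hours. Positive = charging, negative = discharging.
    by_price = sorted(range(len(prices)), key=lambda h: prices[h])
    schedule = [0.0] * len(prices)
    energy = 0.0
    for h in by_price:                      # cheapest hours first: charge
        if energy >= capacity_kwh:
            break
        step = min(rate_kw, capacity_kwh - energy)
        schedule[h] = step
        energy += step
    for h in reversed(by_price):            # most expensive hours first: discharge
        if energy <= 0:
            break
        if schedule[h] != 0:                # never discharge in a charge hour
            continue
        step = min(rate_kw, energy)
        schedule[h] = -step
        energy -= step
    return schedule

prices = [0.08, 0.07, 0.06, 0.10, 0.22, 0.25, 0.18, 0.09]   # EUR/kWh, toy profile
sched = schedule_charging(prices, capacity_kwh=20, rate_kw=10)
profit = -sum(p * e for p, e in zip(prices, sched))          # cost avoided
```

A real V2B schedule would also respect battery degradation and the building's load profile; the sketch only captures the price-arbitrage core of the idea.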
Abstract:
Evaluating the reliability, warranty period, and power degradation of high-concentration solar cells is crucial to introducing this new technology to the market. The reliability of high-concentration GaAs solar cells, as measured in temperature-accelerated life tests, is described in this paper. GaAs cells were tested under highly thermally accelerated conditions that emulated operation under 700 or 1050 suns over a period exceeding 10 000 h. Progressive power degradation was observed, although no catastrophic failures occurred. An Arrhenius activation energy of 1.02 eV was determined from these tests. The solar cell reliability [R(t)] under working conditions of 65°C was evaluated for different failure limits (1–10% power loss). From this reliability function, the mean time to failure and the warranty time were evaluated. Solar cell temperature appeared to be the primary determinant of reliability and warranty period, with concentration being the secondary determinant. A 30-year warranty for these 1 mm² GaAs cells (manufactured according to a light-emitting-diode-like approach) may be offered for both concentrations (700 and 1050 suns) if the solar cell is operated at a working temperature of 65°C.
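The temperature-acceleration logic behind these reliability numbers can be made concrete with the Arrhenius acceleration factor. The activation energy (1.02 eV) and use temperature (65°C) come from the abstract; the 150°C test temperature below is a hypothetical placeholder, since the actual test conditions are not given here.

```python
import math

K_B_EV = 8.617e-5  # Boltzmann constant in eV/K

def acceleration_factor(ea_ev, t_use_c, t_test_c):
    # Arrhenius acceleration between the test and use temperatures:
    #   AF = exp( (Ea / k) * (1/T_use - 1/T_test) ), T in kelvin.
    t_use_k = t_use_c + 273.15
    t_test_k = t_test_c + 273.15
    return math.exp((ea_ev / K_B_EV) * (1.0 / t_use_k - 1.0 / t_test_k))

# Ea = 1.02 eV (from the tests above); 150 C is an assumed test temperature.
af = acceleration_factor(1.02, t_use_c=65.0, t_test_c=150.0)
# Each test hour then represents 'af' hours of field operation at 65 C,
# which is how >10 000 h of accelerated testing can support a 30-year warranty.
```

The high activation energy (about 1 eV) is what makes cell temperature, rather than concentration, the dominant factor: the acceleration factor grows exponentially with the inverse-temperature difference.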
Abstract:
The Large Hadron Collider is the world's largest and most powerful particle accelerator. The project is divided into phases: the first runs from 2009 until 2020, and the second will consist of the implementation of upgrades. One of these upgrades is to increase the collision rate, i.e. the luminosity. This is the main objective of the Hi-Lumi LHC project, one of the most important projects carrying out the upgrades. Luminosity could be increased by using a new material, Nb3Sn, in the superconducting magnets placed at the interaction points, instead of the NbTi used at present. Before implementing it, many aspects must be analysed; one of them is the quality of the induced magnetic field. The tool used so far has been ROXIE, software developed at CERN by S. Russenschuck. One of the main features of the programme is its time-transient analysis, which is based on three magnetization models. These models are quite precise for fields above 1.5 T, but not very accurate for lower fields. The aim of this project is therefore to evaluate a more accurate model, the Classical Preisach Model of Hysteresis, in order to better analyse the induced field quality in the new material Nb3Sn.
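The Classical Preisach Model mentioned above represents hysteresis as a weighted superposition of elementary relay operators (hysterons), each switching up at a threshold alpha and down at a threshold beta ≤ alpha. The sketch below shows the history dependence this produces; the thresholds and weights are illustrative values only, not a fitted Nb3Sn magnetisation model.

```python
def preisach_output(input_history, hysterons):
    # Classical Preisach model: sum of weighted relays. Each relay switches
    # to +1 when the input rises past alpha, to -1 when it falls past beta,
    # and otherwise keeps its previous state (this memory is the hysteresis).
    states = [-1] * len(hysterons)          # all relays start "down"
    for x in input_history:
        for i, (alpha, beta, _) in enumerate(hysterons):
            if x >= alpha:
                states[i] = 1
            elif x <= beta:
                states[i] = -1
    return sum(w * s for (_, _, w), s in zip(hysterons, states))

# Three hysterons (alpha, beta, weight) -- illustrative values only.
relays = [(0.2, -0.2, 1.0), (0.5, 0.1, 0.5), (0.8, -0.5, 0.25)]
# Same final input (0.3), different histories -> different outputs:
up = preisach_output([0.0, 1.0, 0.3], relays)     # rise to 1.0, return to 0.3
down = preisach_output([0.0, -1.0, 0.3], relays)  # fall to -1.0, return to 0.3
```

The two calls end at the same input value yet give different outputs, which is precisely the low-field memory effect that a single-valued magnetization curve cannot capture.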
Abstract:
This work presents the first application of total-reflection X-ray fluorescence (TXRF) spectrometry, a new and powerful alternative analytical method, to the evaluation of the bioaccumulation kinetics of gold nanorods (GNRs) in various tissues upon intravenous administration in mice. The analytical parameters of the developed TXRF methodology were evaluated by parallel analysis of bovine liver certified reference material samples (BCR-185R) doped with 10 μg/g gold. The average values (n = 5) achieved for gold measurements in lyophilized tissue were as follows: recovery 99.7%, expanded uncertainty (k = 2) 7%, repeatability 1.7%, detection limit 112 ng/g, and quantification limit 370 ng/g. The GNR bioaccumulation kinetics were analyzed in several vital mammalian organs, such as liver, spleen, brain, and lung, at different times. Additionally, urine samples were analyzed to study the kinetics of elimination of the GNRs by this excretion route. The main achievement was clearly differentiating two kinds of behavior: GNRs were quickly bioaccumulated by highly vascular filtration organs such as the liver and spleen, while they showed no bioaccumulation in the brain and lung over the period investigated. In parallel, urine also showed a lack of GNR accumulation. TXRF has proven to be a powerful, versatile, and precise analytical technique for the evaluation of GNR content in biological systems and, more generally, of any kind of metallic nanoparticles.
Abstract:
This article presents a new automatic evaluation method for on-line graphics, its application, and the numerous advantages achieved by applying this correcting method. The software application, developed by the Innovation in Education Group "E4" of the Technical University of Madrid, is oriented toward the online self-assessment of the graphic drawings that students carry out as continuous training. The adaptation to the European Higher Education Area is an important opportunity to research the possibilities of on-line education assessment. To this end, a new software tool has been developed for continuous self-testing by undergraduates. Using this software it is possible to evaluate the students' graphical answers: the drawings made on-line by students are automatically corrected according to their geometry (straight lines, sloping lines or second-order curves) and their sizes (depending on the specific values which define the graphics).
Abstract:
One important step in a successful project-based learning (PBL) methodology is providing students with convenient feedback that allows them to keep developing their projects or to improve them. However, this task is more difficult in massive courses, especially when the project deadline is close. Moreover, continuous evaluation makes it necessary to find ways to measure students' performance objectively and continuously without excessively increasing the instructors' workload. To alleviate these problems, we have developed a web service that allows students to request personal tutoring assistance during laboratory sessions by specifying the kind of problem they have and the person who could help them solve it. This service provides tools for the staff to manage the laboratory, to perform continuous evaluation of all students and of the student collaborators, and to prioritize tutoring according to the progress of each student's project. Additionally, the application provides objective metrics which can be used at the end of the course to support some students' final scores. Usability statistics and the results of a subjective evaluation with more than 330 students confirm the success of the proposed application.
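The prioritisation rule described — tutoring ordered by the progress of the student's project — can be sketched with a small priority queue. The class and field names below are illustrative assumptions, not the actual API of the web service.

```python
import heapq

class TutoringQueue:
    """Serve tutoring requests: least project progress first, then FIFO."""

    def __init__(self):
        self._heap = []
        self._order = 0   # tie-breaker so equal-progress requests stay FIFO

    def request(self, student, problem_kind, progress_pct):
        # Lower progress -> smaller key -> served earlier.
        heapq.heappush(self._heap, (progress_pct, self._order, student, problem_kind))
        self._order += 1

    def next_request(self):
        _, _, student, problem_kind = heapq.heappop(self._heap)
        return student, problem_kind

q = TutoringQueue()
q.request("alice", "wiring fault", progress_pct=80)
q.request("bob", "compiler error", progress_pct=35)
q.request("carol", "wiring fault", progress_pct=35)
```

With this ordering, students furthest behind are attended first near the deadline, while the request log itself yields the objective metrics mentioned above.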
Abstract:
Volcanic activity intervenes in many facets of human activity, and not always negative ones. Nonetheless, research into volcanic activity is more often motivated by its danger and risk. There are safety reasons for maintaining the monitoring of volcanic activity, in order to guarantee the life and safety of human settlements near volcanic edifices. This thesis defines and implements a system for monitoring movements of the Earth's crust on the islands of Tenerife and La Palma, where the social impact of an increase (or variation) in volcanic activity would be very severe. Aside from the archipelago's high population density, the population increases significantly at different periods throughout the year due to tourism, which represents the islands' major source of revenue. The population and the tourist centres are spread mainly along the coasts and also along the flanks of the volcanic edifices. Perhaps the preservation of these social and socio-economic structures is the most important reason justifying the monitoring of volcanic activity in the Canary Islands. Recently, more and more work has been done on attempting to predict volcanic activity using new geodetic monitoring systems, since volcanic activity manifests itself beforehand through deformation of the Earth's crust and changes in the force of gravity in the zone where volcanic events are later recorded.
The new devices and sensors developed in recent years in areas such as geodesy, Earth observation from space, and satellite positioning have made it possible to observe and measure both the deformation produced in the terrain and the changes in the force of gravity before, during, and after volcanic events. These devices and sensors have changed the geodetic techniques and methodologies used previously: classic methods have been renovated and new ones developed that are now established as proven, recognised methodologies for volcanic monitoring. Since the end of the 1990s, various projects have been carried out in the Canary Islands whose principal aims have been the development of new observation and monitoring techniques on the one hand, and the design of an appropriate volcanic monitoring methodology on the other. The study and development of GNSS techniques for monitoring crustal deformations and their velocity field on the islands of Tenerife and La Palma is presented here. In its implementation, the existing geodetic and monitoring infrastructure in the archipelago has been used in order to optimise costs, complemented with new stations to give full coverage of both islands. The results obtained in the projects, which are described in this report, have provided new perspectives on the geodetic monitoring of volcanic activity and revealed new zones of interest which were previously unknown in the Canary Islands. Special care has been taken with the treatment and propagation of errors during the entire process of observing, measuring, and processing the recorded data, in order to quantify the reliability of the results obtained.
In the same vein, the results obtained have been verified against others from satellite radar observation systems, and this study also incorporates the implications that the joint use of radar and GNSS technologies will have for the future monitoring of deformations in the Earth's crust.
Abstract:
Prothoracicotropic hormone (PTTH) is the central cerebral neurohormone in insect development. Its release has been believed for decades to be confined to one (or two) critical moments early in each developmental stage at which time it triggers prolonged activation of the prothoracic glands to synthesize and release the steroid molting hormones (ecdysteroids), which elicit developmental responses in target tissues. We used an in vitro assay for PTTH released from excised brains of the bug Rhodnius prolixus and report that release of PTTH does occur at the expected time on day 6, but that this release is merely the first in a daily rhythm of release that continues throughout most of the 21 days of larval-adult development. This finding, together with reports of circadian control of ecdysteroid synthesis and titer throughout this time, raises significant challenges to several features of the current understanding of the hormonal control of insect development. New questions are raised concerning the function(s) of PTTH, its relationship with the prothoracic glands, and the significance of circadian rhythmicity throughout this endocrine axis. The significance of the reported observations derives from the set of entirely new questions they raise concerning the regulation of insect development.
Abstract:
ALICE is one of the four major experiments at the LHC particle accelerator, installed in the European laboratory CERN. The management committee of the LHC accelerator has just approved an upgrade programme for this experiment. Among the upgrades planned for the coming years of the ALICE experiment are improving the resolution and tracking efficiency while maintaining the excellent particle identification ability, and increasing the read-out event rate to 100 kHz. To achieve this, it is necessary to upgrade the Time Projection Chamber (TPC) and Muon tracking (MCH) detectors by modifying the read-out electronics, which is not suitable for this migration. To overcome this limitation, the design, fabrication and experimental testing of a new ASIC named SAMPA has been proposed. This ASIC will support both positive and negative polarities, with 32 channels per chip and continuous data read-out, with smaller power consumption than the previous versions. This work covers the design, fabrication and experimental testing of a read-out front-end in 130 nm CMOS technology with configurable polarity (positive/negative), peaking time and sensitivity. The new SAMPA ASIC can be used in both chambers (TPC and MCH). The proposed front-end is composed of a Charge Sensitive Amplifier (CSA) and a semi-Gaussian shaper. In order to integrate 32 channels per chip, the design of the proposed front-end requires small area and low power consumption, but at the same time low noise. In this sense, a new noise and PSRR (Power Supply Rejection Ratio) improvement technique for the CSA design, without power or area impact, is proposed in this work. The analysis and equations of the proposed circuit are presented, and were verified by electrical simulations and by experimental tests of a produced chip with 5 channels of the designed front-end. The measured equivalent noise charge was <550 e− for 30 mV/fC of sensitivity at an input capacitance of 18.5 pF. The total core area of the front-end was 2300 μm × 150 μm, and the measured total power consumption was 9.1 mW per channel.
Abstract:
Increasing economic competition drives industry to implement tools that improve the efficiency of its processes. Process automation is one of these tools, and Real-Time Optimization (RTO) is an automation methodology that considers economic aspects to update the process control in accordance with market prices and disturbances. Basically, RTO uses a steady-state phenomenological model to predict the process behavior, and then optimizes an economic objective function subject to this model. Although largely implemented in industry, there is no general agreement about the benefits of implementing RTO, due to some limitations discussed in the present work: structural plant/model mismatch, identifiability issues and the low frequency of set-point updates. Some alternative RTO approaches have been proposed in the literature to handle the problem of structural plant/model mismatch. However, there is no thorough comparison evaluating the scope and limitations of these RTO approaches under different aspects. For this reason, the classical two-step method is compared to more recent derivative-based methods (Modifier Adaptation; Integrated System Optimization and Parameter Estimation; and Sufficient Conditions of Feasibility and Optimality) using a Monte Carlo methodology. The results of this comparison show that the classical RTO method is consistent provided that the model is flexible enough to represent the process topology, that the parameter estimation method is appropriate to the measurement noise characteristics, and that a method is used to improve the quality of the sample information. At each iteration, the RTO methodology updates some key parameters of the model, and it is possible to observe identifiability issues caused by lack of measurements and by measurement noise, resulting in poor prediction ability. Therefore, four different parameter estimation approaches (Rotational Discrimination; Automatic Selection and Parameter Estimation; Reparametrization via Differential Geometry; and classical nonlinear Least Squares) are evaluated with respect to their prediction accuracy, robustness and speed. The results show that the Rotational Discrimination method is the most suitable for implementation in an RTO framework, since it requires less a priori information, is simple to implement and avoids the overfitting caused by the Least Squares method. The third RTO drawback discussed in the present thesis is the low frequency of set-point updates, which increases the period during which the process operates at suboptimal conditions. An alternative is proposed in this thesis: integrating classic RTO and Self-Optimizing Control (SOC) using a new Model Predictive Control strategy. The new approach demonstrates that it is possible to reduce the problem of infrequent set-point updates, improving the economic performance. Finally, the practical aspects of RTO implementation are examined in an industrial case study, a Vapor Recompression Distillation (VRD) process located at Petrobras' Paulínia refinery. The conclusions of this study suggest that the model parameters are successfully estimated by the Rotational Discrimination method; that RTO is able to improve the process profit by about 3%, equivalent to 2 million dollars per year; and that the integration of SOC and RTO may be an interesting control alternative for the VRD process.
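The classical two-step RTO loop discussed above can be sketched on a toy process: step 1 re-estimates a model parameter from the plant measurement, step 2 re-optimises the economic objective under the updated model. The linear model structure, the objective function, and the grid search below are illustrative assumptions, chosen so there is no structural plant/model mismatch.

```python
def two_step_rto(measure, profit, u0, theta0, n_iter=20):
    # Classical two-step RTO on the toy model y_model(u) = theta * u:
    #   step 1: parameter estimation -- fit theta to the plant measurement
    #           at the current operating point u;
    #   step 2: economic optimisation -- pick the set point u maximising
    #           profit under the updated model (coarse grid search here).
    u, theta = u0, theta0
    grid = [0.1 + 0.01 * i for i in range(500)]
    for _ in range(n_iter):
        theta = measure(u) / u                             # step 1
        u = max(grid, key=lambda c: profit(c, theta * c))  # step 2
    return u, theta

plant = lambda u: 2.0 * u            # toy plant: no structural mismatch
obj = lambda u, y: 3.0 * y - u ** 2  # product value minus quadratic input cost
u_opt, theta_hat = two_step_rto(plant, obj, u0=1.0, theta0=1.0)
# With a perfect model structure the loop converges to the true optimum u = 3.
```

When the model structure matches the plant, as here, the two-step loop converges to the plant optimum; the thesis's point is that structural mismatch or poorly identified parameters break exactly this property.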
Abstract:
The endangered Rio Grande silvery minnow (Hybognathus amarus; RGSM) is nearing extinction and requires immediate recovery actions. In a draft report to the Middle Rio Grande Program, a Program Advisory Panel (PAP) recommended 34 short-term recovery measures to be implemented within the next five years, and 21 long-term recovery actions. However, these recommendations need further analysis to determine whether and how the actions could be implemented. This project evaluated the short-term recommendations provided by the PAP to identify the most effective actions and rank their relative importance. These recommendations were divided into 7 recovery categories: population augmentation (5 recommendations); hydrologic regimen (4 recommendations); physical habitat (3 recommendations); biological component of habitat (2 recommendations); population monitoring (6 recommendations); monitoring, analysis, and modeling (12 recommendations); and information and planning (2 recommendations). Each recommendation was evaluated for its potential to produce anticipated recovery progress for the RGSM based on assessments of its degree of feasibility, cost effectiveness, and associated potential consequences if fully implemented. In addition, other alternatives in a range of options were considered where applicable. In each case, recommendations were evaluated through on-site visits, interviews with researchers and resource managers, and literature review. Based on the research findings, three major groupings of the recovery categories were identified: natural aspects of recovery, program aspects of recovery, and emergency measures. At least one recovery category within each major grouping was considered important by the respondents, which indicates that each grouping is of high importance in relation to the situation at hand. However, actions within each grouping have varying priorities given constraints on available resources, time, and budget.
An integrated approach that accounts for the complexity of the river system and the species itself was considered the best approach to avoid extinction while ensuring long-term sustainability. Results of these analyses were provided directly to the PAP for their review, and it is anticipated that project results could advance implementation of the most appropriate immediate recovery actions needed to prevent extinction of the RGSM.
Abstract:
Recently, many efforts have been made in the academic world to adapt degrees to the new European Higher Education Area (EHEA). New technologies have been the most important factor in carrying out this adaptation. In particular, 2.0 tools have been spreading quickly, not just on the Web but across all educational levels. Nevertheless, it is now necessary to evaluate whether all these efforts and changes, carried out in order to improve students' academic performance, have produced good results. This paper therefore studies the impact of the implementation of information and communication technologies (ICTs) in a Master's course at the University of Alicante in the 2010-2011 academic year: the elective course "Advanced Visual Ergonomics" from the Master of Clinical Optometry and Vision. The methodology used to teach this course differs from the traditional one in many respects. For example, one of the resources used is a blog developed specifically to coordinate a series of virtual assignments, whose purpose is for students to go deeper into specific aspects of the current topic; the student then takes an active role by writing a personal assessment on the blog. The course also includes face-to-face lessons, in which the teacher presents certain topics in a more traditional way, that is, in a lecture supported by audiovisual materials such as PowerPoint presentations. To evaluate the quality of the results achieved with this methodology, this work collects the personal assessments of the students who completed the course during this academic year; in particular, we want to know their opinion of the resources used and the methodology followed. The tool used to collect this information was a questionnaire. 
This questionnaire evaluates different aspects of the course: general opinion, the quality of the information received, satisfaction with the methodology followed, and the students' critical awareness. The design of the questionnaire is very important for obtaining conclusive information about the methodology followed in the course. It must have an adequate number of questions: if it has too many, it may bore students, who would then not pay enough attention. The questions should be well written, with a clear structure and message, to avoid confusion and ambiguity. They should be objective, without suggesting a desired answer. In addition, the questionnaire should be interesting, to encourage the students' engagement. In conclusion, the questionnaire developed for this course provided good information for evaluating whether the methodology was a useful tool for teaching "Advanced Visual Ergonomics". Furthermore, the students' opinions collected by the questionnaire may be very helpful for improving this didactic resource.
Abstract:
This article describes the adaptation and validation of the Distance Education Learning Environments Survey (DELES) for use in investigating the qualities found in distance and hybrid education psycho-social learning environments in Spain. As Europe moves toward post-secondary student mobility, equity of access to higher education, and more standardised degree programs across the European Higher Education Area (EHEA), the need has arisen for a high-quality method for continually assessing the excellence of distance and hybrid learning environments. This study outlines how the English-language DELES was adapted into the new Spanish-Distance Education Learning Environments Survey (S-DELES) for use with a Bachelor of Psychology and Criminology degree program offering both distance and hybrid education classes. We present the relationships between psycho-social learning environment perceptions and those of student affect. We also present the asynchronous aspects of the environment, scale means, and a comparison between the perceptions of distance education students and their hybrid education counterparts that informs the university about the baseline health of the information and communication technologies (ICT) environment within which the study was conducted.
Abstract:
This work explores the multi-element capabilities of inductively coupled plasma mass spectrometry with collision/reaction cell technology (CCT-ICP-MS) for the simultaneous determination of both spectrally interfered and non-interfered nuclides in wine samples using a single set of experimental conditions. The influence of the cell gas type (i.e. He, He+H2 and He+NH3), cell gas flow rate and sample pre-treatment (i.e. water dilution or acid digestion) on the background-equivalent concentration (BEC) of several nuclides covering the mass range from 7 to 238 u has been studied. Results obtained in this work show that operating the collision/reaction cell at a compromise cell gas flow rate (i.e. 4 mL min−1) improves BEC values for interfered nuclides without a significant effect on the BECs of non-interfered nuclides, with the exception of the light elements Li and Be. Among the different cell gas mixtures tested, the use of He or He+H2 is preferred over He+NH3 because NH3 generates new spectral interferences. No significant influence of the sample pre-treatment methodology (i.e. dilution or digestion) on the multi-element capabilities of CCT-ICP-MS for the simultaneous analysis of interfered and non-interfered nuclides was observed. Nonetheless, sample dilution should be kept to a minimum to ensure that light nuclides (e.g. Li and Be) can be quantified in wine. Finally, a direct 5-fold aqueous dilution is recommended for the simultaneous trace and ultra-trace determination of spectrally interfered and non-interfered elements in wine by means of CCT-ICP-MS. The use of the CCT is mandatory for interference-free ultra-trace determination of Ti and Cr. Only Be could not be determined when using the CCT, due to a deteriorated limit of detection compared to conventional ICP-MS.
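The background-equivalent concentration (BEC) used above as the figure of merit has a simple definition: the analyte concentration whose net signal equals the background. A minimal sketch of that calculation, with invented count rates:

```python
def bec(background_cps, standard_cps, standard_conc):
    # Sensitivity from a single standard (net counts per unit concentration),
    # then the concentration equivalent to the background signal.
    sensitivity = (standard_cps - background_cps) / standard_conc
    return background_cps / sensitivity

# Invented example: 200 cps background, 10 200 cps for a 1 ug/L standard.
bec_value = bec(200.0, 10200.0, 1.0)   # result in ug/L
```

A lower BEC means the background contributes less relative to the analyte signal, which is why removing a spectral interference (lowering the background at that mass) directly improves the BEC.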