905 results for Development of large software systems
Abstract:
Quality data are not only relevant for successful Data Warehousing or Business Intelligence applications; they are also a precondition for the efficient and effective use of Enterprise Resource Planning (ERP) systems. ERP professionals in all kinds of businesses are concerned with data quality issues, as a survey conducted by the Institute of Information Systems at the University of Bern has shown. Using the results of this survey, this paper demonstrates why data quality problems can occur in modern ERP systems and suggests how ERP researchers and practitioners can handle issues around the quality of data in an ERP software environment.
Abstract:
A census of 925 U.S. colleges and universities offering master's and doctoral degrees was conducted in order to study the number of elements of an environmental management system, as defined by ISO 14001, possessed by small, medium and large institutions. A 30% response rate was achieved, with 273 responses included in the final data analysis. Overall, the number of ISO 14001 elements implemented among the 273 institutions ranged from 0 to 16, with a median of 12. There was no significant association between the number of elements implemented and the size of the institution (p = 0.18; Kruskal-Wallis test) or USEPA region (p = 0.12; Kruskal-Wallis test). The proportion of U.S. colleges and universities that reported having implemented a structured, comprehensive environmental management system, defined by answering yes to all 16 elements, was 10% (95% C.I. 6.6%–14.1%); however, 38% (95% C.I. 32.0%–43.8%) reported that they had implemented a structured, comprehensive environmental management system, while 30.0% (95% C.I. 24.7%–35.9%) are planning to implement a comprehensive environmental management system within the next five years. Stratified analyses were performed by institution size, Carnegie Classification and job title. The Osnabruck model, and another under development by the South Carolina Sustainable Universities Initiative, are the only two environmental management system models that have been proposed specifically for colleges and universities, although several guides are now available. The Environmental Management System Implementation Model for U.S. Colleges and Universities developed here is an adaptation of the ISO 14001 standard and USEPA recommendations, tailored to U.S. colleges and universities in order to streamline the implementation process. It is hoped that this implementation model, created for the U.S. research and academic setting, will provide these highly specialized institutions with a clearer and more cost-effective path towards the implementation of an EMS and greater compliance with local, state and federal environmental legislation.
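As an aside (not part of the abstract), the reported interval of 10% (95% C.I. 6.6%–14.1%) is consistent with an exact (Clopper-Pearson) binomial confidence interval for roughly 27 "yes to all 16 elements" responses out of 273; the exact count of 27 is an assumption here, used only to illustrate the calculation. A minimal sketch:

```python
from scipy.stats import beta

k, n = 27, 273  # assumed count of institutions answering yes to all 16 elements
p_hat = k / n

# Clopper-Pearson (exact) 95% confidence interval for a binomial proportion
lower = beta.ppf(0.025, k, n - k + 1)
upper = beta.ppf(0.975, k + 1, n - k)

# prints roughly 9.9% with a CI close to the reported 6.6%-14.1%
print(f"{p_hat:.1%} (95% CI {lower:.1%}-{upper:.1%})")
```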
Abstract:
Feline immunodeficiency virus (FIV)-based gene transfer systems are being seriously considered for human gene therapy as an alternative to vectors based on primate lentiviruses, a genetically complex group of retroviruses capable of infecting non-dividing cells. The greater phylogenetic distance between the feline and primate lentiviruses is thought to reduce the chances of generating recombinant viruses. However, the safety of FIV-based vector systems has not been tested experimentally. Since primate lentiviruses such as the human and simian immunodeficiency viruses (HIV/SIV) can cross-package each other's genomes, we tested this trait with respect to FIV. Unexpectedly, the feline and primate lentiviruses were reciprocally able to both cross-package and propagate each other's RNA genomes. This was largely due to the recognition of viral packaging signals by the heterologous proteins. However, a simple retrovirus such as Mason-Pfizer monkey virus (MPMV) was unable to package FIV RNA. Interestingly, FIV could package MPMV RNA, but not propagate it for further steps of replication. These findings suggest that upon co-infection of the same host, cross-packaging may allow distinct retroviruses to generate chimeric variants with unknown pathogenic potential. In order to understand the packaging determinants in FIV, we conducted a detailed mutational analysis of the region thought to contain the FIV packaging signal. We show that the first 90–120 nt of the 5′ untranslated region (UTR) and the first 90 nt of gag were simultaneously required for efficient FIV RNA packaging. These results suggest that the primary FIV packaging signal is multipartite and discontinuous, composed of two core elements separated by 150 nt of the 5′ UTR. The above studies are being used towards the development of safer FIV-based self-inactivating (SIN) vectors. These vectors are being designed to eliminate the ability of FIV transfer vector RNAs to be mobilized by primate lentiviral proteins that may be present in the target cells. Preliminary testing of the first generation of these vectors has revealed that they are incapable of being propagated by feline proteins. The inability of FIV transfer vectors to express packageable vector RNA after integration should greatly increase the safety of FIV vectors for human gene therapy.
Abstract:
An increasing number of studies have examined the effects of elevated carbon dioxide (CO2) and ocean acidification on marine fish, yet little is known about the effects on large pelagic fish. We tested the effects of elevated CO2 on the early life history development and behaviour of yellowtail kingfish, Seriola lalandi. Eggs and larvae were reared in current day control (450 µatm) and two elevated CO2 treatments for a total of 6 d, from 12 h post-fertilization until 3 d post-hatching (dph). Elevated CO2 treatments matched projections for the open ocean by the year 2100 under RCP 8.5 (880 µatm CO2) and a higher level (1700 µatm CO2) relevant to upwelling zones where pelagic fish often spawn. There was no effect of elevated CO2 on survival to hatching or 3 dph. Oil globule diameter decreased with an increasing CO2 level, indicating potential effects of elevated CO2 on energy utilization of newly hatched larvae, but other morphometric traits did not differ among treatments. Contrary to expectations, there were no effects of elevated CO2 on larval behaviour. Activity level, startle response, and phototaxis did not differ among treatments. Our results contrast with findings for reef fish, where a wide range of sensory and behavioural effects have been reported. We hypothesize that the absence of behavioural effects in 3 dph yellowtail kingfish is due to the early developmental state of newly hatched pelagic fish. Behavioural effects of high CO2 may not occur until larvae commence branchial acid-base regulation when the gills develop; however, further studies are required to test this hypothesis. Our results suggest that the early stages of kingfish development are tolerant to rising CO2 levels in the ocean.
Abstract:
Clustered small manufacturers are believed to attain various types of collective efficiency. A woodworking and furniture SME district in Uganda has created a learning environment for artisans to start up their own workshops. In the district, workers can access managerial information, including business skills, as well as input materials more easily than outside it. Hence the district attracted new entrants and continued to grow. By contrast, large firms are located separately and dispersed away from the SME district, and they hold a negative image of SMEs. This dichotomy has been created partly through the spatial division of the two sectors and partly through policy favouritism toward large firms.
Abstract:
Las gemas se evalúan mediante la norma de clasificación visual (UNE 56544), pero su aplicación en estructuras existentes y grandes escuadrías resulta poco eficaz y conduce a estimaciones demasiado conservadoras. Este trabajo analiza la influencia de las gemas comparando la resistencia de piezas con gemas y piezas correctamente escuadradas. Se han analizado 218 piezas de pino silvestre con dimensiones nominales 150 x 200 x 4.200 mm, de las que 102 presentaban una gema completa a lo largo de toda su longitud y el resto estaban correctamente escuadradas. En las piezas con gema se ha medido la altura de la sección cada 30 cm (altura en cada cara y altura máxima). Para determinar la resistencia se han ensayado todas las piezas de acuerdo a la norma EN 408. Se ha comparado la resistencia obtenida para las piezas con gema, diferenciando si la gema se encuentra en el borde comprimido o en el borde traccionado, con las piezas escuadradas. Puede concluirse que la presencia de gemas disminuye la resistencia excepto si la gema se encuentra en el borde traccionado, en cuyo caso los resultados obtenidos han sido similares a los de las piezas escuadradas. Wanes on structural timber are evaluated according to the visual grading standard (UNE 56544), but its application to existing structures and large cross sections is ineffective and leads to overly conservative estimations. This paper analyzes the influence of wanes by comparing the bending strength of pieces with wanes and of properly squared pieces. 218 pieces of Scots pine with nominal dimensions of 150 x 200 x 4200 mm were analyzed; 102 of them had a complete wane along their whole length and the rest were properly squared. For the pieces with wane, the height of the cross section was measured every 30 cm (the height on each side and the maximum height). The bending strength of all the pieces was obtained according to the EN 408 standard. The bending strength of the pieces with wane was compared with that of the squared pieces, taking into account whether the wane is positioned on the compressed edge or on the tensioned edge. It can be concluded that the bending strength of pieces with wanes is lower than that of squared pieces, except when the wane is on the tensioned edge of the beam, in which case the results obtained were similar to those of the squared pieces.
Abstract:
In a degree course such as Forestry Engineering, the general teaching objectives consist of explaining and helping students to understand the principles of Mechanics. For some time now we have encountered significant difficulties in teaching this subject, due to the students' lack of motivation and to their insufficient prior preparation for the topic. If we add to this the discipline's inherent complexity and the students' preconceptions about the subject, these teaching difficulties become considerable. For this reason, a series of didactic activities has been introduced sequentially in the teaching of this subject. This work describes the methodology, procedure and results of one such action: the development of a group work project using Descartes software. The results of this experiment can be considered very positive. Some of the preconceptions that are critical for learning the subject can be corrected, and the tutoring process in the classroom contributes to an improvement in teacher-student communication. Since this scheme was established, the number of students taking part each academic year has increased, and this is the group with the greatest percentage of passing scores.
Abstract:
Automated and semi-automated accessibility evaluation tools are key to streamlining the process of accessibility assessment, and ultimately to ensuring that software products, contents, and services meet accessibility requirements. Different evaluation tools may better fit different needs and concerns, accounting for a variety of corporate and external policies, content types, invocation methods, deployment contexts, exploitation models, intended audiences and goals, and the specific overall process where they are introduced. This has led to the proliferation of many evaluation tools tailored to specific contexts. However, tool creators, who may not be familiar with the realm of accessibility and may be part of a larger project, lack any systematic guidance when facing the implementation of accessibility evaluation functionality. Herein we present a systematic approach to the development of accessibility evaluation tools, leveraging the different artifacts and activities of a standardized development process model (the Unified Software Development Process), and providing templates of these artifacts tailored to accessibility evaluation tools. The work presented especially considers the work in progress in this area by the W3C/WAI Evaluation and Repair Tools Working Group (ERT WG).
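By way of illustration only (this example is not from the paper), the core of an automated accessibility evaluation functionality is a set of machine-checkable rules applied to content. A minimal sketch in Python, checking a single WCAG-style rule (images must carry a text alternative) with the standard library's HTML parser:

```python
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    """Flags <img> elements that lack a non-empty alt attribute (WCAG 1.1.1-style rule)."""
    def __init__(self):
        super().__init__()
        self.findings = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):
                self.findings.append(f"img missing alt text: src={attrs.get('src')}")

checker = AltTextChecker()
checker.feed('<p><img src="logo.png"><img src="chart.png" alt="Sales by quarter"></p>')
print(checker.findings)  # -> ['img missing alt text: src=logo.png']
```

A production tool would combine many such rules, cover other content types, and map the findings onto a reporting format; the systematic approach in the paper concerns how such functionality is specified and developed, not this particular rule.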
Abstract:
Following the Integrated Water Resources Management approach, the European Water Framework Directive requires Member States to develop water management plans at the catchment level. Those plans have to integrate the different interests and must be developed with stakeholder participation. To meet these requirements, managers need tools to assess the impacts of possible management alternatives on natural and socio-economic systems. These tools should ideally be able to address the complexity and uncertainties of the water system, while serving as a platform for stakeholder participation. The objective of our research was to develop a participatory integrated assessment model, based on the combination of a crop model, an economic model and a participatory Bayesian network, with an application in the middle Guadiana sub-basin, in Spain. The methodology is intended to capture the complexity of water management problems, incorporating the relevant sectors, as well as the relevant scales involved in water management decision making. The integrated model has allowed us to test different management, market and climate change scenarios and to assess the impacts of such scenarios on the natural system (crops), on the socio-economic system (farms) and on the environment (water resources). Finally, this integrated assessment modelling process has allowed stakeholder participation, complying with the main requirements of current European water laws.
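Purely as an illustration (this is not the authors' model), a Bayesian network of the kind mentioned links scenario variables to outcomes through conditional probability tables, so that the impact of a scenario can be read off as a marginal distribution over the outcome of interest. A minimal two-link sketch with made-up probabilities:

```python
# Tiny discrete Bayesian network: climate scenario -> water availability -> farm income.
# All probabilities are illustrative placeholders, not values from the Guadiana study.

p_avail = {  # P(water availability | climate scenario)
    "baseline": {"high": 0.7, "low": 0.3},
    "dry":      {"high": 0.3, "low": 0.7},
}
p_income = {  # P(farm income | water availability)
    "high": {"good": 0.8, "poor": 0.2},
    "low":  {"good": 0.4, "poor": 0.6},
}

def income_distribution(scenario):
    """Marginalise over water availability to obtain P(income | scenario)."""
    dist = {"good": 0.0, "poor": 0.0}
    for avail, p_a in p_avail[scenario].items():
        for income, p_i in p_income[avail].items():
            dist[income] += p_a * p_i
    return dist

print(income_distribution("dry"))  # {'good': 0.52, 'poor': 0.48}
```

In a participatory setting, the structure and the conditional probability tables are elicited with stakeholders, which is what makes the network a platform for participation as well as an assessment tool.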
Abstract:
The aim of this chapter is to discuss the applicability of recently proposed knowledge modelling tools to the development of agent-based systems. The discussion is derived from real-world experience with a particular software tool called KSM (Knowledge Structure Manager). The chapter provides details about this tool and then proceeds to show in which ways the software may be used to support the development of agent-based systems. Two multiagent systems, one in the field of telecommunications management and the other in the field of flood control, are described. Conclusions about these studies are presented, summarizing the main contributions that knowledge modelling tools can bring to the development of agent-based systems.
Abstract:
Una apropiada evaluación de los márgenes de seguridad de una instalación nuclear, por ejemplo, una central nuclear, tiene en cuenta todas las incertidumbres que afectan a los cálculos de diseño, funcionamiento y respuesta ante accidentes de dicha instalación. Una fuente de incertidumbre son los datos nucleares, que afectan a los cálculos neutrónicos, de quemado de combustible o activación de materiales. Estos cálculos permiten la evaluación de las funciones respuesta esenciales para el funcionamiento correcto durante operación, y también durante accidente. Ejemplos de esas respuestas son el factor de multiplicación neutrónica o el calor residual después del disparo del reactor. Por tanto, es necesario evaluar el impacto de dichas incertidumbres en estos cálculos. Para poder realizar los cálculos de propagación de incertidumbres, es necesario implementar metodologías que sean capaces de evaluar el impacto de las incertidumbres de estos datos nucleares. Pero también es necesario conocer los datos de incertidumbres disponibles para ser capaces de manejarlos. Actualmente, se están invirtiendo grandes esfuerzos en mejorar la capacidad de analizar, manejar y producir datos de incertidumbres, en especial para isótopos importantes en reactores avanzados. A su vez, nuevos programas/códigos están siendo desarrollados e implementados para poder usar dichos datos y analizar su impacto. Todos estos puntos son parte de los objetivos del proyecto europeo ANDES, el cual ha dado el marco de trabajo para el desarrollo de esta tesis doctoral. Por tanto, primero se ha llevado a cabo una revisión del estado del arte de los datos nucleares y sus incertidumbres, centrándose en los tres tipos de datos: de decaimiento, de rendimientos de fisión y de secciones eficaces. A su vez, se ha realizado una revisión del estado del arte de las metodologías para la propagación de incertidumbre de estos datos nucleares. Dentro del Departamento de Ingeniería Nuclear (DIN) se propuso una metodología para la propagación de incertidumbres en cálculos de evolución isotópica, el Método Híbrido. Esta metodología se ha tomado como punto de partida para esta tesis, implementando y desarrollando dicha metodología, así como extendiendo sus capacidades. Se han analizado sus ventajas, inconvenientes y limitaciones. El Método Híbrido se utiliza en conjunto con el código de evolución isotópica ACAB, y se basa en el muestreo por Monte Carlo de los datos nucleares con incertidumbre. En esta metodología, se presentan diferentes aproximaciones según la estructura de grupos de energía de las secciones eficaces: en un grupo, en un grupo con muestreo correlacionado y en multigrupos. Se han desarrollado diferentes secuencias para usar distintas librerías de datos nucleares almacenadas en diferentes formatos: ENDF-6 (para las librerías evaluadas), COVERX (para las librerías en multigrupos de SCALE) y EAF (para las librerías de activación). Gracias a la revisión del estado del arte de los datos nucleares de los rendimientos de fisión se ha identificado la falta de información sobre sus incertidumbres, en concreto, de matrices de covarianza completas. Además, visto el renovado interés por parte de la comunidad internacional, a través del grupo de trabajo internacional de cooperación para evaluación de datos nucleares (WPEC) dedicado a la evaluación de las necesidades de mejora de datos nucleares mediante el subgrupo 37 (SG37), se ha llevado a cabo una revisión de las metodologías para generar datos de covarianza.
Se ha seleccionado la actualización Bayesiana/GLS para su implementación, y de esta forma, dar una respuesta a dicha falta de matrices completas para rendimientos de fisión. Una vez que el Método Híbrido ha sido implementado, desarrollado y extendido, junto con la capacidad de generar matrices de covarianza completas para los rendimientos de fisión, se han estudiado diferentes aplicaciones nucleares. Primero, se estudia el calor residual tras un pulso de fisión, debido a su importancia para cualquier evento después de la parada/disparo del reactor. Además, se trata de un ejercicio claro para ver la importancia de las incertidumbres de datos de decaimiento y de rendimientos de fisión junto con las nuevas matrices completas de covarianza. Se han estudiado dos ciclos de combustible de reactores avanzados: el de la instalación europea para transmutación industrial (EFIT) y el del reactor rápido de sodio europeo (ESFR), en los cuales se ha analizado el impacto de las incertidumbres de los datos nucleares en la composición isotópica, calor residual y radiotoxicidad. Se han utilizado diferentes librerías de datos nucleares en los estudios anteriores, comparando de esta forma el impacto de sus incertidumbres. A su vez, mediante dichos estudios, se han comparado las distintas aproximaciones del Método Híbrido con otras metodologías para la propagación de incertidumbres de datos nucleares: Total Monte Carlo (TMC), desarrollada en NRG por A.J. Koning y D. Rochman, y NUDUNA, desarrollada en AREVA GmbH por O. Buss y A. Hoefer. Estas comparaciones demostrarán las ventajas del Método Híbrido, además de revelar sus limitaciones y su rango de aplicación. ABSTRACT For an adequate assessment of safety margins of nuclear facilities, e.g. nuclear power plants, it is necessary to consider all possible uncertainties that affect their design, performance and possible accidents. Nuclear data are a source of uncertainty involved in neutronics, fuel depletion and activation calculations. These calculations can predict critical response functions during operation and in the event of an accident, such as decay heat and neutron multiplication factor. Thus, the impact of nuclear data uncertainties on these response functions needs to be addressed for a proper evaluation of the safety margins. Methodologies for performing uncertainty propagation calculations need to be implemented in order to analyse the impact of nuclear data uncertainties. At the same time, it is necessary to understand the current status of nuclear data and their uncertainties, in order to be able to handle this type of data. Great efforts are underway to enhance the European capability to analyse/process/produce covariance data, especially for isotopes which are of importance for advanced reactors. At the same time, new methodologies/codes are being developed and implemented for using and evaluating the impact of uncertainty data. These were the objectives of the European ANDES (Accurate Nuclear Data for nuclear Energy Sustainability) project, which provided a framework for the development of this PhD Thesis. Accordingly, first a review of the state-of-the-art of nuclear data and their uncertainties is conducted, focusing on the three kinds of data: decay, fission yields and cross sections. A review of the current methodologies for propagating nuclear data uncertainties is also performed.
The Nuclear Engineering Department of UPM has proposed a methodology for propagating uncertainties in depletion calculations, the Hybrid Method, which has been taken as the starting point of this thesis. This methodology has been implemented, developed and extended, and its advantages, drawbacks and limitations have been analysed. It is used in conjunction with the ACAB depletion code, and is based on Monte Carlo sampling of variables with uncertainties. Different approaches are presented depending on the cross-section energy-group structure: one-group, one-group with correlated sampling and multi-group. Differences and applicability criteria are presented. Sequences have been developed for using different nuclear data libraries in different storage formats: ENDF-6 (for evaluated libraries) and COVERX (for multi-group libraries of SCALE), as well as the EAF format (for activation libraries). A review of the state of the art of fission yield data shows a lack of uncertainty information, specifically of complete covariance matrices. Furthermore, the international community has expressed a renewed interest in the issue through the Working Party on International Nuclear Data Evaluation Co-operation (WPEC), whose Subgroup 37 (SG37) is dedicated to assessing needs for nuclear data improvement. This motivated the review of the state of the art of methodologies for generating covariance data for fission yields. A Bayesian/generalised least squares (GLS) updating sequence has been selected and implemented to address this need. Once the Hybrid Method has been implemented, developed and extended, along with the fission yield covariance generation capability, different applications are studied. The Fission Pulse Decay Heat problem is tackled first because of its importance during events after shutdown and because it is a clean exercise for showing the impact and importance of decay and fission yield data uncertainties in conjunction with the new covariance data. Two fuel cycles of advanced reactors are studied: the European Facility for Industrial Transmutation (EFIT) and the European Sodium Fast Reactor (ESFR), and uncertainties in response functions such as isotopic composition, decay heat and radiotoxicity are addressed. Different nuclear data libraries are used and compared. These applications serve as frameworks for comparing the different approaches of the Hybrid Method, and also for comparing it with other methodologies: Total Monte Carlo (TMC), developed at NRG by A.J. Koning and D. Rochman, and NUDUNA, developed at AREVA GmbH by O. Buss and A. Hoefer. These comparisons reveal the advantages, limitations and range of application of the Hybrid Method.
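As a purely illustrative aside (this is neither the Hybrid Method nor the ACAB code), the core idea of Monte Carlo propagation of a nuclear data uncertainty can be sketched in a few lines: sample the uncertain datum from its distribution, push each sample through the depletion/decay model, and read the response function uncertainty off the resulting distribution. All numbers below are made-up placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data (illustrative only): one nuclide with decay constant lam and a
# 5 % (1-sigma) relative uncertainty, initial inventory N0, energy Q per decay.
lam_mean, lam_rel_unc = 1.0e-5, 0.05      # [1/s], relative 1-sigma
N0, Q = 1.0e20, 1.0e-13                   # atoms, J per decay
t = 3600.0                                # cooling time [s]

n_samples = 10_000
lam = rng.normal(lam_mean, lam_rel_unc * lam_mean, n_samples)

# Propagate each sampled decay constant through the analytic decay model
decay_heat = Q * lam * N0 * np.exp(-lam * t)   # [W]

print(f"decay heat = {decay_heat.mean():.3e} W "
      f"+/- {decay_heat.std(ddof=1):.3e} W "
      f"({decay_heat.std(ddof=1) / decay_heat.mean():.1%} relative)")
```

The thesis applies the same principle to full depletion chains, with correlated multi-group cross-section sampling driven by covariance matrices rather than the single independent parameter used here.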
Abstract:
This paper presents the experimental three-year learning activity developed by a group of teachers in a wind tunnel facility. The authors, leading a team of students, carried out a project consisting of the design, assembly and testing of a wind tunnel. The project included all stages of the process, from the initial specifications to the final flow quality assessments, going through the calculation of each element and the building of the whole wind tunnel. The group of (final-year) students was responsible for the whole wind tunnel project as part of their bachelor's degree project. The paper focuses on the development of the wind tunnel data acquisition software. This automatic tool is essential for improving the automation of data acquisition in the wind tunnel facility systems, in particular for a 6-DOF multi-axis force/torque sensor. This work can be considered a typical example of real engineering practice: a set of specifications that has to be modified due to the constraints imposed throughout the project, in order to obtain the final result.
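For illustration only (this is not the project's actual software), a typical acquisition step for a multi-axis force/torque sensor averages several raw gauge readings and maps them to forces and torques through a vendor-supplied calibration matrix. A generic sketch, with placeholder values throughout:

```python
import numpy as np

# Illustrative only: placeholder calibration matrix mapping 6 raw gauge
# readings to [Fx, Fy, Fz, Mx, My, Mz]; a real sensor ships its own matrix.
CAL = np.eye(6)

def read_raw_sample():
    """Placeholder for the hardware read (e.g. through a DAQ card driver)."""
    return np.random.normal(0.0, 0.01, size=6)

def acquire(n_samples=100):
    """Average n raw samples and convert them to forces/torques."""
    raw = np.mean([read_raw_sample() for _ in range(n_samples)], axis=0)
    return CAL @ raw

fx, fy, fz, mx, my, mz = acquire()
print(f"Fz = {fz:.4f} (placeholder units)")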
Abstract:
Automated Teller Machines (ATMs) are sensitive self-service systems that require important investments in security and testing. ATM certifications are testing processes for machines that integrate software components from different vendors, and they are performed before deployment for public use. This project originated from the need to optimize the certification process in an ATM manufacturing company. The process identifies compatibility problems between software components through testing. It is composed of a huge number of manual user tasks, which makes the process very expensive and error-prone. Moreover, it is not possible to fully automate the process, as it requires human intervention for manipulating ATM peripherals. This project presented important challenges for the development team. First, this is a critical process, as all the ATM operations rely on the software under test. Second, the context of use of ATM applications is vastly different from that of ordinary software. Third, ATMs' useful lifetime is beyond 15 years, and both new and old models need to be supported. Fourth, the know-how for efficient testing depends on each specialist and is not explicitly documented. Fifth, the huge number of tests and their importance imply the need for user efficiency and accuracy. All these factors led us to conclude that, besides the technical challenges, the usability of the intended software solution was critical for the project's success. This business context is the motivation for this Master's Thesis project. Our proposal focused on the development process applied. By combining user-centered design (UCD) with agile development, we ensured both the high priority of usability and the early mitigation of software development risks caused by all the technology constraints. We performed 23 development iterations and were finally able to deliver a working solution on time, in line with users' expectations. The evaluation of the project was carried out through usability tests, in which 4 real users participated in different tests in the real context of use. The results were positive according to different metrics: error rate, efficiency, effectiveness, and user satisfaction. We discuss the problems found, the benefits and the lessons learned in the process. Finally, we measured the expected project benefits by comparing the effort required by the current and the new process (once the new software tool is adopted). The savings corresponded to 40% less effort (man-hours) per certification. Future work includes additional evaluation of product usability in a real scenario (with customers) and measuring the benefits in terms of quality improvement.
Abstract:
The development of mixed-criticality virtualized multi-core systems poses new challenges that are the subject of active research work. There is an additional complexity: it is now necessary to identify a set of partitions and to allocate applications to partitions. In this task, a number of issues have to be considered, such as the criticality level of the application, security and dependability requirements, the granularity of time requirements, etc. The MultiPARTES [11] toolset relies on Model Driven Engineering (MDE), which is a suitable approach in this setting, as it helps to bridge the gap between design issues and partitioning concerns. MDE is changing the way systems are developed nowadays, reducing development time. In general, modelling approaches have shown their benefits when applied to embedded systems. These benefits have been achieved by fostering reuse with an intensive use of abstractions, or by automating the generation of boilerplate code.
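As a purely illustrative sketch (not the MultiPARTES algorithm), one simple way to derive a candidate partition set from the attributes listed above is to group applications so that no partition mixes criticality levels or security requirements:

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass(frozen=True)
class App:
    name: str
    criticality: str   # e.g. a safety-integrity level label
    security: str      # e.g. "restricted" / "open"

def derive_partitions(apps):
    """Group applications so that no partition mixes criticality or security levels."""
    partitions = defaultdict(list)
    for app in apps:
        partitions[(app.criticality, app.security)].append(app.name)
    return dict(partitions)

apps = [
    App("flight_control", "high", "restricted"),
    App("health_monitor", "high", "restricted"),
    App("infotainment",   "low",  "open"),
]
print(derive_partitions(apps))
# {('high', 'restricted'): ['flight_control', 'health_monitor'],
#  ('low', 'open'): ['infotainment']}
```

In an MDE setting such as the one described, this kind of rule would operate on the system model rather than on hand-written data structures, and additional constraints (dependability, timing granularity, resource budgets) would refine the initial grouping.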