964 results for Systems development


Relevance: 40.00%

Abstract:

With the increasing importance of nanotechnologies in everyday life, it is not difficult to realize that even a single molecule, if properly designed, can act as a device able to perform useful functions: such a chemical species is called a chemosensor, i.e. a molecule of abiotic origin that signals the presence of matter or energy. Signal transduction is the mechanism by which the interaction of a sensor with an analyte yields a measurable form of energy. When designing a chemosensor, we need to take into account a "communication requirement" between its three components: the receptor unit, responsible for selective analyte binding; the spacer, which controls the geometry of the system and modulates the electronic interaction between the receptor and the signalling unit; and the signalling unit itself, whose physico-chemical properties change upon complexation. A luminescent chemosensor communicates a variation of the physico-chemical properties of the receptor unit through a luminescence output signal. This thesis work consists of the characterization of new molecular and nanoparticle-based systems which can be used as sensitive materials for the construction of new optical transduction devices able to provide information about the concentration of analytes in solution. In particular, two directions were taken. The first is to continue the development of new chemosensors, which is the first step towards the construction of reliable and efficient devices; here the work focuses on chemosensors for metal ions for biomedical and environmental applications. The second is to study more efficient and complex organized systems, such as derivatized silica nanoparticles. These systems can potentially have higher sensitivity than molecular systems, and present many advantages, such as the possibility of ratiometric measurements, larger Stokes shifts and a higher signal-to-noise ratio.
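The ratiometric advantage mentioned above can be illustrated with a short sketch: because the output is a ratio of two emission bands, fluctuations in probe concentration or excitation intensity cancel out. The function name and intensity values below are hypothetical, chosen only to show the principle.

```python
def ratiometric_signal(i_bound, i_free):
    """Ratio of emission intensities at two wavelengths.

    i_bound: intensity at the band enhanced by analyte binding
    i_free:  intensity at the band of the free chemosensor
    """
    return i_bound / i_free

# Simulated readings: absolute intensities halve (e.g. probe photobleaching
# or lamp drift), but the ratio -- and thus the inferred analyte level --
# is unchanged.
r1 = ratiometric_signal(800.0, 400.0)   # bright measurement
r2 = ratiometric_signal(400.0, 200.0)   # same sample, half the signal
print(r1, r2)  # both 2.0
```

A single-wavelength sensor would report these two measurements as different analyte levels; the ratio does not.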

Relevance: 40.00%

Abstract:

This study investigates the growth and metabolite production of microorganisms causing spoilage of Atlantic cod (Gadus morhua) fillets packaged under air and modified atmosphere (60 % CO2, 40 % O2). Samples were provided by two different retailers (A and B). Packaged fillets were stored at 4 °C and 8 °C. Microbiological quality and metabolite production of cod fillets stored under MAP at 4 °C, MAP at 8 °C and air were monitored during 13, 7 and 3 days of storage, respectively. Volatile compound concentrations in the headspace were quantified by selected ion flow tube mass spectrometry (SIFT-MS), and their correlation with microbiological spoilage was studied. The onset of volatile compound detection was mostly observed around 7 log cfu/g of total psychrotrophic count. Trimethylamine and dimethyl sulfide were the dominant volatiles under all tested storage conditions; nevertheless, there was no close correlation between the concentration of each main VOC and the percentage of rejection based on sensory evaluation. It was therefore concluded that these volatiles cannot be considered sole indicators of the quality of cod fillets stored under modified atmosphere and air.
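The kind of count-versus-volatile relationship examined above can be sketched numerically: correlate microbial counts with a (log-transformed) VOC concentration and locate the onset of detection. The paired observations below are invented for illustration, not data from the study.

```python
import numpy as np

# Hypothetical paired observations over storage time:
# total psychrotrophic count (log cfu/g) and TMA concentration (ppb).
log_counts = np.array([3.1, 4.5, 5.8, 6.9, 7.4, 8.0])
tma_ppb    = np.array([0.0, 0.0, 0.1, 0.9, 25.0, 140.0])

# Pearson correlation between counts and log-transformed TMA
# (+1 avoids log(0) at the early, undetectable sampling points).
r = np.corrcoef(log_counts, np.log10(tma_ppb + 1.0))[0, 1]
print(f"Pearson r = {r:.2f}")

# Onset of detection: first sampling point where TMA exceeds a
# detection threshold, reported as the corresponding microbial load.
onset = log_counts[np.argmax(tma_ppb > 0.5)]
print(f"TMA detected from ~{onset} log cfu/g")
```

With these toy numbers, detection begins near 7 log cfu/g, mirroring the behaviour reported in the abstract.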

Relevance: 40.00%

Abstract:

Book review of: Kendall, Ann; Rodríguez, Abelardo. Desarrollo y Perspectivas de los Sistemas de Andenerías en los Andes Centrales del Perú (Development and Perspectives of Irrigated Terrace Systems in the Peruvian Central Andes). Cuzco, Peru. ISBN: 978-9972-691-93-5.

Relevance: 40.00%

Abstract:

To compare the Mankin and OARSI cartilage histopathology assessment systems using human articular cartilage from a large number of donors across the adult age spectrum, representing all levels of cartilage degradation.

Relevance: 40.00%

Abstract:

In recent years, Geographic Information Systems (GIS) have increasingly been used in a wide array of application contexts for development cooperation in lowlands and mountain areas. When used for planning, implementation, and monitoring, GIS is a versatile and highly efficient tool, particularly in mountain areas characterized by great spatial diversity and inaccessibility. However, the establishment and application of GIS in mountain regions generally presents considerable technical challenges. Moreover, it is necessary to address specific institutional and organizational issues regarding implementation.

Relevance: 40.00%

Abstract:

Feline immunodeficiency virus (FIV)-based gene transfer systems are being seriously considered for human gene therapy as an alternative to vectors based on primate lentiviruses, a genetically complex group of retroviruses capable of infecting non-dividing cells. The greater phylogenetic distance between the feline and primate lentiviruses is thought to reduce the chances of generating recombinant viruses. However, the safety of FIV-based vector systems has not been tested experimentally. Since primate lentiviruses such as the human and simian immunodeficiency viruses (HIV/SIV) can cross-package each other's genomes, we tested this trait with respect to FIV. Unexpectedly, the feline and primate lentiviruses were reciprocally able to both cross-package and propagate each other's RNA genomes. This was largely due to the recognition of viral packaging signals by the heterologous proteins. However, a simple retrovirus such as Mason-Pfizer monkey virus (MPMV) was unable to package FIV RNA. Interestingly, FIV could package MPMV RNA, but not propagate it through further steps of replication. These findings suggest that upon co-infection of the same host, cross-packaging may allow distinct retroviruses to generate chimeric variants with unknown pathogenic potential.

In order to understand the packaging determinants in FIV, we conducted a detailed mutational analysis of the region thought to contain the FIV packaging signal. We show that the first 90–120 nt of the 5′ untranslated region (UTR) and the first 90 nt of gag were simultaneously required for efficient FIV RNA packaging. These results suggest that the primary FIV packaging signal is multipartite and discontinuous, composed of two core elements separated by 150 nt of the 5′ UTR.

The above studies are being used towards the development of safer FIV-based self-inactivating (SIN) vectors. These vectors are being designed to eliminate the ability of FIV transfer vector RNAs to be mobilized by primate lentiviral proteins that may be present in the target cells. Preliminary tests of the first generation of these vectors have revealed that they are incapable of being propagated by feline proteins. The inability of FIV transfer vectors to express packageable vector RNA after integration should greatly increase the safety of FIV vectors for human gene therapy.

Relevance: 40.00%

Abstract:

Public participation is increasingly advocated as a necessary feature of natural resources management. The EU Water Framework Directive (WFD) is one example, as it prescribes participatory processes as necessary features of basin management plans (EC, 2000). The rationale behind this mandate is that involving interest groups ideally yields higher-quality decisions, which are arguably more likely to meet public acceptance (Pahl-Wostl, 2006). Furthermore, failing to involve stakeholders in policy-making might hamper the implementation of management initiatives, as controversial decisions can lead pressure lobbies to generate public opposition (Giordano et al., 2005; Mouratiadou and Moran, 2007).

Relevance: 40.00%

Abstract:

This paper presents a testing methodology for applying Behaviour Driven Development (BDD) techniques while developing Multi-Agent Systems (MAS), the so-called BEhavioural Agent Simple Testing (BEAST) methodology. It is supported by an open-source framework (the BEAST Tool) which automatically generates test case skeletons from BDD scenario specifications. The framework allows testing MASs based on the JADE or JADEX platforms and offers a set of configurable mock agents which allow tests to be executed while the system is under development. The BEAST Tool has been validated in the development of a MAS for fault diagnosis in FTTH (Fiber To The Home) networks.
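The abstract does not show the BEAST Tool's actual API, so the following is a hedged plain-Python sketch of the underlying idea only: a Given/When/Then test that exercises an agent under development against a configurable mock peer, so the test can run before the real peer agent exists. All class, method and message names are hypothetical.

```python
class MockAgent:
    """Configurable stand-in for an agent that is not yet implemented:
    it records every message and replies with a canned response."""
    def __init__(self, canned_reply):
        self.canned_reply = canned_reply
        self.received = []

    def handle(self, message):
        self.received.append(message)
        return self.canned_reply


class DiagnosisAgent:
    """Toy agent under test: asks a peer for a link status and
    classifies the network element as faulty or healthy."""
    def __init__(self, peer):
        self.peer = peer

    def diagnose(self, element_id):
        status = self.peer.handle(("query-status", element_id))
        return "faulty" if status == "link-down" else "healthy"


# GIVEN a mock peer configured to report a broken link
peer = MockAgent(canned_reply="link-down")
# WHEN the agent under test runs a diagnosis
verdict = DiagnosisAgent(peer).diagnose("onu-17")
# THEN the fault is reported and the query actually reached the peer
assert verdict == "faulty"
assert ("query-status", "onu-17") in peer.received
```

The mock both isolates the agent under test and lets the test verify the interaction protocol (which messages were sent), which is the role the configurable mock agents play in the methodology.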

Relevance: 40.00%

Abstract:

An appropriate assessment of the safety margins of a nuclear facility, e.g. a nuclear power plant, takes into account all the uncertainties that affect the calculations of the design, operation and accident response of that facility. One source of uncertainty is nuclear data, which affect neutronics, fuel depletion and material activation calculations. These calculations allow the evaluation of the response functions that are essential for correct performance during operation, and also during an accident. Examples of such responses are the neutron multiplication factor or the decay heat after reactor shutdown. It is therefore necessary to evaluate the impact of these uncertainties on such calculations. In order to perform uncertainty propagation calculations, methodologies capable of evaluating the impact of nuclear data uncertainties must be implemented. But it is also necessary to know which uncertainty data are available in order to be able to handle them. Currently, great efforts are being invested in improving the capability to analyse, handle and produce uncertainty data, especially for isotopes that are important in advanced reactors. At the same time, new programs/codes are being developed and implemented in order to use such data and analyse their impact. All these points are part of the objectives of the European ANDES project, which has provided the framework for this doctoral thesis. Therefore, a review of the state of the art of nuclear data and their uncertainties has first been carried out, focusing on three types of data: decay data, fission yields and cross sections. In addition, a review of the state of the art of the methodologies for propagating the uncertainties of these nuclear data has been performed.
Within the Department of Nuclear Engineering (DIN), a methodology for propagating uncertainties in isotopic evolution calculations, the Hybrid Method, was proposed. This methodology has been taken as the starting point of this thesis: it has been implemented and developed, and its capabilities have been extended; its advantages, drawbacks and limitations have been analysed. The Hybrid Method is used in conjunction with the isotopic evolution code ACAB and is based on Monte Carlo sampling of the nuclear data with uncertainties. Within this methodology, different approaches are presented according to the energy-group structure of the cross sections: one group, one group with correlated sampling, and multigroup. Different sequences have been developed to use nuclear data libraries stored in different formats: ENDF-6 (for evaluated libraries), COVERX (for the multigroup libraries of SCALE) and EAF (for activation libraries). The review of the state of the art of fission yield data identified a lack of information on their uncertainties, specifically of complete covariance matrices. Moreover, given the renewed interest of the international community, expressed through the Working Party on International Nuclear Data Evaluation Co-operation (WPEC) and its Subgroup 37 (SG37), dedicated to assessing needs for nuclear data improvements, a review of the methodologies for generating covariance data has been carried out. Bayesian/GLS updating has been selected for implementation, in order to remedy this lack of complete covariance matrices for fission yields.
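The Bayesian/GLS update selected above for producing complete fission-yield covariance matrices can be sketched in generic form: a prior vector with covariance is updated with linear constraint data (here, a known sum of yields), which both adjusts the values and introduces the off-diagonal correlations a complete matrix needs. All numbers below are illustrative, not evaluated data.

```python
import numpy as np

def gls_update(y, V, G, d, W):
    """Bayesian/GLS update of prior values y (covariance V) with
    linear constraint data d = G @ y_true (covariance W)."""
    S = G @ V @ G.T + W                 # innovation covariance
    K = V @ G.T @ np.linalg.inv(S)      # gain
    y_post = y + K @ (d - G @ y)
    V_post = V - K @ G @ V
    return y_post, V_post

# Toy example: three independent yields that must sum to a known value.
y = np.array([0.30, 0.50, 0.15])
V = np.diag([0.01, 0.02, 0.005])        # prior: no correlations
G = np.ones((1, 3))                     # constraint: sum of the yields
d = np.array([1.0])                     # required sum
W = np.array([[1e-6]])                  # tight constraint uncertainty

y_post, V_post = gls_update(y, V, G, d, W)
print(y_post.sum())   # ~1.0: constraint now satisfied
print(V_post)         # off-diagonals now negative: anti-correlations
```

The negative off-diagonal terms are the point: a sum constraint anti-correlates the yields, which is exactly the information a complete covariance matrix carries and a diagonal uncertainty file cannot.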
Once the Hybrid Method had been implemented, developed and extended, together with the capability to generate complete covariance matrices for fission yields, different nuclear applications were studied. First, the decay heat after a fission pulse is studied, owing to its importance for any event after reactor shutdown/trip; it is also a clean exercise for showing the importance of the uncertainties of decay data and fission yields together with the new complete covariance matrices. Two fuel cycles of advanced reactors have been studied: that of the European Facility for Industrial Transmutation (EFIT) and that of the European Sodium Fast Reactor (ESFR), for which the impact of nuclear data uncertainties on the isotopic composition, decay heat and radiotoxicity has been analysed. Different nuclear data libraries have been used in these studies, thus comparing the impact of their uncertainties. Through these studies, the different approaches of the Hybrid Method have also been compared with other methodologies for propagating nuclear data uncertainties: Total Monte Carlo (TMC), developed at NRG by A.J. Koning and D. Rochman, and NUDUNA, developed at AREVA GmbH by O. Buss and A. Hoefer. These comparisons demonstrate the advantages of the Hybrid Method, as well as revealing its limitations and its range of application.

ABSTRACT

For an adequate assessment of the safety margins of nuclear facilities, e.g. nuclear power plants, it is necessary to consider all possible uncertainties that affect their design, performance and possible accidents. Nuclear data are a source of uncertainty involved in neutronics, fuel depletion and activation calculations. These calculations can predict critical response functions during operation and in the event of an accident, such as the decay heat and the neutron multiplication factor.
Thus, the impact of nuclear data uncertainties on these response functions needs to be addressed for a proper evaluation of the safety margins. Methodologies for performing uncertainty propagation calculations need to be implemented in order to analyse the impact of nuclear data uncertainties. It is also necessary to understand the current status of nuclear data and their uncertainties in order to be able to handle this type of data. Great efforts are underway to enhance the European capability to analyse, process and produce covariance data, especially for isotopes which are of importance for advanced reactors. At the same time, new methodologies/codes are being developed and implemented for using uncertainty data and evaluating their impact. These were the objectives of the European ANDES (Accurate Nuclear Data for nuclear Energy Sustainability) project, which provided the framework for the development of this PhD thesis. Accordingly, a review of the state of the art of nuclear data and their uncertainties is first conducted, focusing on three kinds of data: decay data, fission yields and cross sections. A review of the current methodologies for propagating nuclear data uncertainties is also performed. The Nuclear Engineering Department of UPM has proposed a methodology for propagating uncertainties in depletion calculations, the Hybrid Method, which has been taken as the starting point of this thesis. This methodology has been implemented, developed and extended, and its advantages, drawbacks and limitations have been analysed. It is used in conjunction with the ACAB depletion code and is based on Monte Carlo sampling of variables with uncertainties. Different approaches are presented depending on the cross-section energy structure: one group, one group with correlated sampling, and multigroup. Differences and applicability criteria are presented.
Sequences have been developed for using nuclear data libraries in different storage formats: ENDF-6 (for evaluated libraries), COVERX (for the multigroup libraries of SCALE) and EAF (for activation libraries). A review of the state of the art of fission yield data shows gaps in the uncertainty data, specifically with regard to complete covariance matrices. Furthermore, the international community has expressed renewed interest in the issue through the Working Party on International Nuclear Data Evaluation Co-operation (WPEC) and its Subgroup 37 (SG37), which is dedicated to assessing needs for improved nuclear data. This motivated a review of the state of the art of methodologies for generating covariance data for fission yields, from which a Bayesian/generalised least squares (GLS) updating sequence has been selected and implemented to address this need. Once the Hybrid Method had been implemented, developed and extended, along with the fission yield covariance generation capability, different applications were studied. The fission pulse decay heat problem is tackled first because of its importance during events after shutdown, and because it is a clean exercise for showing the impact and importance of decay and fission yield data uncertainties in conjunction with the new covariance data. Two fuel cycles of advanced reactors are studied: the European Facility for Industrial Transmutation (EFIT) and the European Sodium Fast Reactor (ESFR), and uncertainties in response functions such as isotopic composition, decay heat and radiotoxicity are addressed. Different nuclear data libraries are used and compared. These applications serve as frameworks for comparing the different approaches of the Hybrid Method, and also for comparison with other methodologies: Total Monte Carlo (TMC), developed at NRG by A.J. Koning and D. Rochman, and NUDUNA, developed at AREVA GmbH by O. Buss and A. Hoefer.
These comparisons reveal the advantages, limitations and the range of application of the Hybrid Method.
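The Monte Carlo sampling at the core of the Hybrid Method can be illustrated with a toy decay-heat response: each sample draws a perturbed nuclear data set from the assumed uncertainty distributions, the response is evaluated per sample, and the spread of the results is the propagated uncertainty. The decay constant, energy release and uncertainty values below are invented for illustration; a real sequence would sample full libraries, including covariances.

```python
import numpy as np

rng = np.random.default_rng(42)

def decay_heat(t, n0, lam, q):
    """Decay heat at time t from n0 initial nuclei with decay
    constant lam and energy release q per decay."""
    return q * lam * n0 * np.exp(-lam * t)

# Nominal nuclear data and (hypothetical) relative 1-sigma uncertainties.
lam0, q0 = 1e-3, 0.8            # decay constant [1/s], energy/decay [MeV]
n_samples = 10_000
lam = rng.normal(lam0, 0.05 * lam0, n_samples)   # 5 % uncertainty
q   = rng.normal(q0,   0.02 * q0,   n_samples)   # 2 % uncertainty

# Propagate: evaluate the response once per sampled data set.
h = decay_heat(t=100.0, n0=1e20, lam=lam, q=q)

mean, rel_std = h.mean(), h.std() / h.mean()
print(f"decay heat = {mean:.3e} MeV/s  (rel. std = {rel_std:.1%})")
```

The resulting relative spread (here close to the quadrature sum of the input uncertainties, since the response is nearly linear over these perturbations) is exactly the kind of response-function uncertainty the thesis propagates for decay heat, isotopic composition and radiotoxicity.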

Relevance: 40.00%

Abstract:

Nowadays, organizations store vast amounts of data in databases (DBs), which contain invaluable information. Decision Support Systems (DSSs) provide the support needed to manage this information and to plan the medium- and long-term "modus operandi" of these organizations. Despite the growing importance of these systems, most proposals do not cover their complete development, mostly limiting themselves to the development of isolated parts, which often have serious integration problems. Hence, methodologies that include models and processes considering every factor are necessary. This paper tries to fill this void by proposing an approach for developing spatial DSSs driven by the development of their associated Data Warehouse (DW), without neglecting the other components. To frame the proposal, different software engineering approaches (the Software Engineering Process and Model Driven Architecture) are used, coupled with a DB development methodology, both adapted to the peculiarities of DWs. Finally, an example illustrates the proposal.

Relevance: 40.00%

Abstract:

The development of mixed-criticality virtualized multi-core systems poses new challenges that are the subject of active research work. There is an additional complexity: it is now required to identify a set of partitions and to allocate applications to partitions. In this task, a number of issues have to be considered, such as the criticality level of the application, security and dependability requirements, and the granularity of the timing requirements. The MultiPARTES [11] toolset relies on Model Driven Engineering (MDE), which is a suitable approach in this setting, as it helps to bridge the gap between design issues and partitioning concerns. MDE is changing the way systems are developed nowadays, reducing development time. In general, modelling approaches have shown their benefits when applied to embedded systems. These benefits have been achieved by fostering reuse with an intensive use of abstractions, and by automating the generation of boilerplate code.
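As a minimal sketch of the allocation step described above, the following groups applications into partitions so that no partition mixes criticality levels, which is one common isolation rule in mixed-criticality design. It is not the MultiPARTES algorithm: a real allocator would also weigh security, dependability and timing granularity, and the application names and DAL levels are hypothetical.

```python
from collections import defaultdict

def allocate(apps):
    """Group (name, criticality) pairs into partitions so that each
    partition hosts a single criticality level."""
    partitions = defaultdict(list)
    for name, criticality in apps:
        partitions[criticality].append(name)
    # One partition per criticality level, named after the level.
    return {f"partition_{level}": members
            for level, members in sorted(partitions.items())}

apps = [("flight_ctrl", "DAL-A"), ("logging", "DAL-E"),
        ("nav", "DAL-A"), ("telemetry", "DAL-C")]
print(allocate(apps))
# {'partition_DAL-A': ['flight_ctrl', 'nav'],
#  'partition_DAL-C': ['telemetry'],
#  'partition_DAL-E': ['logging']}
```

Keeping each criticality level in its own partition means the hypervisor's spatial and temporal isolation, rather than per-application certification, bounds the interference between levels.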

Relevance: 40.00%

Abstract:

The importance of vision-based systems for Sense-and-Avoid is increasing nowadays as remotely piloted and autonomous UAVs become part of the non-segregated airspace. The development and evaluation of these systems demand flight scenario images, which are expensive and risky to obtain. Currently, Augmented Reality techniques allow the compositing of real flight scenario images with 3D aircraft models to produce useful, realistic images for system development and benchmarking purposes at a much lower cost and risk. With the techniques presented in this paper, 3D aircraft models are first positioned in a simulated 3D scene with controlled illumination and rendering parameters. Realistic simulated images are then obtained using an image processing algorithm which fuses the images obtained from the 3D scene with images from real UAV flights, taking into account on-board camera vibrations. Since the intruder and camera poses are user-defined, ground truth data are available. These ground truth annotations make it possible to develop and quantitatively evaluate aircraft detection and tracking algorithms. This paper presents the software developed to create a public dataset of 24 videos, together with their annotations and some tracking application results.
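The core compositing step, fusing a rendered aircraft model into a real flight frame, is standard alpha blending; the sketch below assumes a per-pixel opacity mask for the rendered sprite and omits the paper's camera-vibration and illumination handling. Array shapes and values are toy examples.

```python
import numpy as np

def composite(background, sprite, alpha, top, left):
    """Alpha-blend a rendered sprite onto a background frame.
    alpha is the sprite's per-pixel opacity in [0, 1]."""
    out = background.astype(float).copy()
    h, w = sprite.shape[:2]
    region = out[top:top + h, left:left + w]
    # Standard 'over' operator: result = a*sprite + (1-a)*background.
    out[top:top + h, left:left + w] = (
        alpha[..., None] * sprite + (1.0 - alpha[..., None]) * region)
    return out.astype(np.uint8)

# Toy frame (uniform sky) and a 2x2 'aircraft' sprite,
# opaque in the top-left pixel only.
frame  = np.full((4, 4, 3), 200, dtype=np.uint8)
sprite = np.zeros((2, 2, 3), dtype=np.uint8)
alpha  = np.array([[1.0, 0.0], [0.0, 0.0]])

result = composite(frame, sprite, alpha, top=1, left=1)
print(result[1, 1], result[1, 2])   # [0 0 0] (sprite) vs [200 200 200]
```

Because the sprite's position (`top`, `left`) is chosen by the user, the intruder's pixel location is known exactly, which is precisely why the compositing approach yields ground truth annotations for free.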