942 results for software, translation, validation tool, VMNET, Wikipedia, XML


Relevance: 30.00%

Publisher:

Abstract:

The paper presents the main elements of a project entitled ICT-Emissions, which aims at developing a novel methodology to evaluate the impact of ICT-related measures on mobility, vehicle energy consumption and the CO2 emissions of vehicle fleets at the local scale, in order to promote the wider application of the most appropriate ICT measures. The proposed methodology combines traffic and emission modelling at the micro and macro scales. These are linked through interfaces and submodules which will be specifically designed and developed. A number of sources are available to the consortium to obtain the necessary input data. In addition, experimental campaigns are offered to fill gaps in the information on traffic and emission patterns. The application of the methodology will be demonstrated using commercially available software. However, the methodology is developed in such a way as to enable its implementation with a variety of emission and traffic models. Particular emphasis is given to (a) the correct estimation of driver behaviour as a result of traffic-related ICT measures, (b) the coverage of a large number of current vehicle technologies, including ICT systems, and (c) near-future technologies such as hybrids, plug-in hybrids, and electric vehicles. The innovative combination of traffic, driver, and emission models produces a versatile toolbox that can simulate the impact on energy and CO2 of infrastructure measures (traffic management, dynamic traffic signs, etc.), driver assistance systems and eco-solutions (speed/cruise control, start/stop systems, etc.), or a combination of measures (cooperative systems). The methodology is validated by application in the Turin area, and its capabilities are further demonstrated by application in real-world conditions in Madrid and Rome.

Relevance: 30.00%

Publisher:

Abstract:

On 12 January 2010, an earthquake hit the city of Port-au-Prince, capital of Haiti. The earthquake reached magnitude Mw 7.0 and its epicenter was located near the town of Léogâne, approximately 25 km west of the capital. The earthquake occurred in the boundary region separating the Caribbean plate from the North American plate. This plate boundary is dominated by left-lateral strike-slip motion and compression, and accommodates about 20 mm/yr of slip, with the Caribbean plate moving eastward with respect to the North American plate (DeMets et al., 2000). Initially, the location and focal mechanism of the earthquake seemed to involve straightforward accommodation of oblique relative motion between the Caribbean and North American plates along the Enriquillo-Plantain Garden fault zone (EPGFZ); however, Hayes et al. (2010) combined seismological observations, geologic field data and space geodetic measurements to show that the rupture process instead involved slip on multiple faults. Moreover, the authors showed that the remaining shallow shear strain will be released in future surface-rupturing earthquakes on the EPGFZ. In December 2010, a Spanish cooperation project financed by the Polytechnic University of Madrid started with a clear objective: the evaluation of seismic hazard and risk in Haiti and its application to seismic design, urban planning, and emergency and resource management. One of the tasks of the project was devoted to the vulnerability assessment of the current building stock and the estimation of seismic risk scenarios. The study was carried out by following the capacity spectrum method as implemented in the SELENA software (Molina et al., 2010). The method requires a detailed classification of the building stock into predominant building typologies (according to the materials of the structure and walls, the number of stories and the age of construction) and the use of the building (residential, commercial, etc.). Combined with knowledge of the soil characteristics of the city, the simulation of a scenario earthquake then provides the seismic risk scenarios (expected damaged buildings). The initial results of the study show that one of the largest sources of uncertainty is the difficulty of achieving a precise classification of building typologies, owing to craft construction carried out without any regulations. It is also observed that, although the occurrence of large earthquakes usually helps to decrease the vulnerability of cities, through the collapse of low-quality buildings and the reconstruction of seismically designed buildings, in the case of Port-au-Prince the seismic risk in most districts remains high, revealing very vulnerable areas. Therefore, the local authorities have to direct their efforts towards quality control of new buildings, reinforcement of the existing building stock, the establishment of seismic codes, and the development of emergency planning, together with the education of the population.
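For illustration only, the sketch below (Python) shows the kind of computation that closes this chain in capacity-spectrum-based risk tools: converting lognormal fragility curves into damage-state probabilities at a given performance-point spectral displacement. It is not the SELENA implementation; the typology, median displacements and dispersions are hypothetical values chosen for the example.

from math import log, sqrt, erf

def lognormal_exceedance(sd, median, beta):
    # Probability of reaching or exceeding a damage state (lognormal fragility curve).
    return 0.5 * (1.0 + erf(log(sd / median) / (beta * sqrt(2.0))))

# Hypothetical fragility parameters (median spectral displacement in cm, dispersion)
# for a single unreinforced-masonry typology.
FRAGILITY = {
    "slight": (0.8, 0.9),
    "moderate": (1.8, 0.9),
    "extensive": (4.5, 0.9),
    "complete": (10.0, 0.9),
}
STATES = ["slight", "moderate", "extensive", "complete"]

def damage_probabilities(sd_cm):
    # Convert exceedance probabilities into discrete damage-state probabilities.
    exceed = {ds: lognormal_exceedance(sd_cm, m, b) for ds, (m, b) in FRAGILITY.items()}
    probs = {"none": 1.0 - exceed["slight"]}
    for ds, nxt in zip(STATES, STATES[1:] + [None]):
        probs[ds] = exceed[ds] - (exceed[nxt] if nxt else 0.0)
    return probs

if __name__ == "__main__":
    # Spectral displacement at the capacity/demand intersection (hypothetical value).
    for state, p in damage_probabilities(2.5).items():
        print(f"{state:>9}: {p:.2f}")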

Relevance: 30.00%

Publisher:

Abstract:

This final-year project, carried out by the telecommunications engineer Pedro M. Matamala Lucas, is the final development phase of a larger project: the SAVID forensic video software. The purpose of the project as a whole is the creation of a software tool capable of analyzing video files coded and compressed with the DV (Digital Video) system. The goal of the analysis is to provide information on whether the magnetic tape shows signs of having been manipulated by editing after the original recording, and also to show the user other relevant data, such as the technical specifications of the video and audio signals. The user, a forensic video analyst, is thus given information that helps assess the originality of the content of the medium under analysis. The specific objective of this final phase is the creation of the user interface of the software, which reports both the binary code of the significant sectors and its interpretation after analysis. It also allows the user to report the results, and it offers other features such as navigating through the sectors of the code that were modified as a side effect of editing the original magnetic tape. Another important objective of the project has been the investigation of software development methodologies and techniques for their subsequent application, seeking greater efficiency in time management and higher software quality in order to guarantee the software's future evolution and maintainability. Emphasis has been placed on agile methodologies, which have gained relevance in the information technology sector over the last decades, replacing classical methodologies such as waterfall development. Their flexibility during the software life cycle yields better results when the specifications are not fully defined, adapting to the conditions of the project. Summarizing the technical specifications: the software has been developed in C++, an object-oriented programming language, using the MFC (Microsoft Foundation Classes) technology for the implementation. It is a dialog-based MFC project, created, compiled and released with the Microsoft Visual Studio 2010 integrated development environment. The architecture is the classic three-layer one, composed of the user interface, the business layer and the data access layer. It was necessary to configure the project with CLR (Common Language Runtime) support in order to implement the report-generation functionality. The software application is accompanied by the project report and its annexes: the Detailed Functional Requirements Specification (EDRF), the User Interface Specification (EIU), the Technical Design (DT) and the User Guide.
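As a purely illustrative sketch of this kind of analysis (it is not the SAVID implementation and does not reproduce its sector interpretation), the following Python snippet scans a DV stream in 80-byte DIF blocks, which is how DV data is organized, and flags discontinuities in a per-block counter as possible edit points; the byte offset of that counter is a hypothetical placeholder.

import sys

BLOCK_SIZE = 80       # DV streams are organized in 80-byte DIF blocks.
COUNTER_OFFSET = 1    # Hypothetical offset of a per-block sequence counter.

def find_discontinuities(path):
    # Yield byte offsets where the counter neither repeats nor increases by one (mod 256).
    previous = None
    offset = 0
    with open(path, "rb") as stream:
        while True:
            block = stream.read(BLOCK_SIZE)
            if len(block) < BLOCK_SIZE:
                break
            counter = block[COUNTER_OFFSET]
            if previous is not None and counter not in (previous, (previous + 1) % 256):
                yield offset
            previous = counter
            offset += BLOCK_SIZE

if __name__ == "__main__":
    for position in find_discontinuities(sys.argv[1]):
        print(f"possible edit point near byte offset {position}")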

Relevance: 30.00%

Publisher:

Abstract:

Introduction Diffusion-weighted imaging (DWI) techniques are able to measure, in vivo and non-invasively, the diffusivity of water molecules inside the human brain. DWI has been applied to cerebral ischemia, brain maturation, epilepsy, multiple sclerosis, etc. [1], and these images are nowadays widely available. DWI allows the identification of brain tissues, so their accurate segmentation is a common initial step for the aforementioned applications. Materials and Methods We present a validation study on automated segmentation of DWI based on Gaussian mixture and hidden Markov random field models. This problem is commonly solved with the iterated conditional modes algorithm, but some studies [2] suggest that graph-cut (GC) algorithms improve the results when the initialization is not close to the final solution. We implemented a segmentation tool integrating ITK with a GC algorithm [3], and validation software using fuzzy overlap measures [4]. Results The segmentation accuracy of each tool is tested against a gold-standard segmentation obtained from a T1 MPRAGE magnetic resonance image of the same subject, registered to the DWI space. The proposed software shows meaningful improvements from using the GC energy minimization approach on DTI and DSI (Diffusion Spectrum Imaging) data. Conclusions Brain tissue segmentation on DWI is a fundamental step in many applications. Accuracy and robustness improvements are achieved with the proposed software, with high impact on the applications' final results.
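To give a concrete idea of the validation step, the snippet below (Python/NumPy) computes a fuzzy generalization of the Dice overlap between two soft label maps, one commonly used family of fuzzy overlap measures; it is a sketch of the general idea rather than the exact measure of [4], and the toy volumes are synthetic.

import numpy as np

def fuzzy_dice(p, q):
    # Fuzzy Dice overlap between two soft label maps with values in [0, 1]:
    # voxel-wise intersection is min(p, q); reduces to ordinary Dice for crisp maps.
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return 2.0 * np.minimum(p, q).sum() / (p.sum() + q.sum())

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    reference = rng.random((16, 16, 16))      # synthetic soft tissue-probability map
    candidate = np.clip(reference + 0.1 * rng.standard_normal(reference.shape), 0.0, 1.0)
    print(f"fuzzy Dice overlap: {fuzzy_dice(reference, candidate):.3f}")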

Relevance: 30.00%

Publisher:

Abstract:

In recent years, the Wireless Visual Sensor Network (WVSN) has drawn great interest in the wireless communication research area. WVSNs enable a wealth of new applications such as building security control, image sensing, and target localization. However, current wireless communication protocols (ZigBee, Wi-Fi, and Bluetooth, for example) cannot fully satisfy the demands of high data rate, low power consumption, short range, and high robustness. A new communication protocol is highly desirable for this kind of application. 
The Ultra Wideband (UWB) wireless communication protocol, which has grown in importance in the high data rate wireless communication field, is emerging as an important topic for WVSN research. UWB offers great promise to satisfy the growing demand for low-cost, high-speed digital wireless indoor and home networks. The large available bandwidth, the potential for high data rate transmission, the potential for low complexity and low power consumption, and the low implementation cost all present a unique opportunity for UWB to become a widely adopted radio solution for future Wireless Personal Area Network (WPAN) applications. UWB is defined as any transmission that occupies a bandwidth of more than 20% of its center frequency, or more than 500 MHz. In 2002, the Federal Communications Commission (FCC) mandated that UWB radio transmission can legally operate in the range from 3.1 to 10.6 GHz at a transmitter power of -41.3 dBm/Hz. Under the FCC guidelines, the use of UWB technology can provide enormous capacity over short communication ranges. Considering Shannon's capacity equation, increasing the channel capacity requires a linear increase in bandwidth, whereas a similar increase in channel capacity would require an exponential increase in transmission power. In recent years, several different UWB developments have been widely studied in different areas, among which the MB-OFDM UWB wireless communication protocol is considered to be the leading choice and has recently been adopted in the ISO/IEC standard for WPANs. By combining OFDM modulation and data transmission using frequency hopping techniques, the MB-OFDM UWB system is able to support various data rates, ranging from 55 to 480 Mbps, over distances of up to 10 meters. The MB-OFDM technology is expected to consume very little power and silicon area, as well as provide low-cost solutions that can satisfy consumer market demands. To fulfill these expectations, MB-OFDM UWB research and development have to cope with several challenges, which consist of high-sensitivity synchronization, low-complexity constraints, strict power limitations, scalability, and flexibility. Such challenges require state-of-the-art digital signal processing expertise to develop systems that can take full advantage of the UWB spectrum and support future indoor wireless applications. This thesis focuses on the full optimization of an MB-OFDM UWB digital baseband transceiver system, aiming at researching and designing a wireless communication subsystem for the Wireless Visual Sensor Network (WVSN) application. The inherently high complexity of the FFT/IFFT processor and the synchronization system, together with the high operating frequency of all processing elements, becomes the bottleneck for the hardware design and implementation of a low-power MB-OFDM based UWB digital baseband system. The proposed transceiver system targets low power and low complexity under the premise of high performance. Optimizations are made at both the algorithm and architecture level for each element of the transceiver system. Low-power, hardware-efficient structures are first proposed for the core computation modules: a mixed-radix, pipelined architecture is proposed for the Fast Fourier Transform (FFT/IFFT) processor, and a cost-speed balanced Viterbi Decoder (VD) module is developed, with the aim of lowering the power consumption and increasing the processing speed. 
In addition, a low-complexity, sign-bit-correlation-based symbol timing synchronization scheme is presented, so as to detect and synchronize the OFDM packets robustly and accurately. Moreover, several state-of-the-art technologies are used for developing the other processing subsystems, and an entire MB-OFDM digital baseband transceiver system is integrated. The target device for the proposed transceiver system is a Xilinx Virtex 5 XC5VLX110T FPGA board. In order to validate the proposed transceiver system on the FPGA board, a unified algorithm-architecture-circuit hardware/software co-design environment for complex FPGA system development is presented in this work. The main objective of the proposed strategy is to find an efficient methodology for designing a configurable, optimized FPGA system with as little effort as possible spent on the system verification procedure, so as to shorten the system development period. The presented co-design methodology has the advantages of being easy to use, covering all steps from algorithm proposal to hardware verification, and being applicable to almost all kinds of FPGA developments. Because only the digital baseband transceiver system is developed in this thesis, the validation of transmitted signals through a wireless channel in real communication environments still requires the analog front-end and RF components. However, by using the aforementioned hardware/software co-simulation methodology, the transmitter and receiver digital baseband systems can communicate with each other through the channel models proposed by the IEEE 802.15.3a research group, implemented in MATLAB. Thus, by simply adjusting the characteristics of each channel model, e.g. mean excess delay and center frequency, we can estimate the transmission performance of the proposed transceiver system in different communication situations. The main contributions of this thesis are: • A novel mixed-radix 128-point FFT algorithm using a multipath pipelined architecture is proposed. The complex multipliers for each processing stage are designed using modified shift-add architectures. The system word-length and twiddle word-length are compared and selected based on Signal to Quantization Noise Ratio (SQNR) and power analysis. • The IFFT processor performance is analyzed under different Block Floating Point (BFP) arithmetic configurations for overflow control, so as to find the best architecture for the IFFT algorithm based on the proposed FFT processor. • An innovative low-complexity timing synchronization and compensation scheme, consisting of Packet Detector (PD) and Timing Offset Estimation (TOE) functions, is employed for the MB-OFDM UWB receiver system. By simplifying the cross-correlation and maximum likelihood functions to sign-bit-only operations, the computational complexity is significantly reduced. • A 64-state soft-decision Viterbi Decoder system using a high-speed radix-4 Add-Compare-Select architecture is proposed. The Two-pointer Even algorithm is also introduced into the Trace Back unit for hardware efficiency. • Several state-of-the-art technologies are integrated into the complete baseband transceiver system, with the aim of implementing a highly optimized UWB communication system. • An improved design flow is proposed for complex system implementation, which can be used for general Field-Programmable Gate Array (FPGA) designs. 
The design method not only dramatically reduces the time for functional verification, but also provides automatic analysis, such as errors and output delays, for the implemented hardware systems. • A virtual communication environment is established for validating the proposed MB-OFDM transceiver system. This methodology proves easy to use and convenient for analyzing the digital baseband system, without an analog front-end, under different communication environments. This PhD thesis is organized in six chapters. Chapter 1 gives a brief introduction to the UWB field and the related work, along with the motivation for the MB-OFDM system development. Chapter 2 presents the general information and requirements of the MB-OFDM UWB wireless communication protocol. Chapter 3 presents the architecture of the MB-OFDM digital baseband transceiver system. The design of the proposed algorithm and architecture for each processing element is detailed in this chapter. The design challenges of such a system involve trade-offs among design complexity, power consumption, hardware cost, system performance, and other aspects; all these factors are analyzed and discussed. In Chapter 4, the hardware/software co-design methodology is proposed. Each step of this design flow is detailed with examples encountered during system development. Then, taking advantage of this design strategy, the virtual communication procedure is carried out to test and analyze the proposed transceiver architecture. Experimental results from the co-simulation and the synthesis report of the implemented FPGA system are given in Chapter 5. Chapter 6 includes conclusions and future work, as well as the results derived from this PhD work.
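As a rough illustration of the sign-bit correlation idea behind the packet detector (a sketch of the general technique in Python, not the thesis hardware design; the preamble length, noise level and peak-picking choices are hypothetical), the received samples are correlated against a known preamble using only their sign bits:

import numpy as np

def sign_bit_correlate(received, preamble):
    # Cross-correlation using only the sign bits (+/-1) of the I and Q components,
    # a cheap stand-in for full complex multiplications.
    r = np.sign(received.real) + 1j * np.sign(received.imag)
    p = np.sign(preamble.real) + 1j * np.sign(preamble.imag)
    n = len(preamble)
    return np.array([abs(np.vdot(p, r[k:k + n])) for k in range(len(r) - n + 1)])

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    preamble = rng.choice([-1.0, 1.0], 128) + 1j * rng.choice([-1.0, 1.0], 128)
    received = 0.7 * (rng.standard_normal(1000) + 1j * rng.standard_normal(1000))
    received[300:428] += preamble          # packet begins at sample 300
    correlation = sign_bit_correlate(received, preamble)
    print("estimated packet start:", int(np.argmax(correlation)))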

Relevance: 30.00%

Publisher:

Abstract:

The growing interest in integrating agile methodologies and usability has brought various challenges to practitioners. This research focuses on a specific part of these challenges: the integration of usability mechanisms (features such as cancel, undo, warning, etc.) into agile requirements, usually written in the form of user stories. To this end, a framework has been developed, composed of a well-defined modeling language that formalizes previous empirical research in the field, models of the impact of usability mechanisms on user stories, and a tool that helps practitioners apply them to user stories. Results show that the use of this framework helps agile developers think about usability from the beginning of the development process, without needing to be experts in the subject. Our proposal can therefore complement other usability practices to improve the quality of use of software developed with agile methodologies.
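As a purely illustrative sketch (the framework's actual modeling language and tool are not reproduced here; the story text, mechanism catalogue and generated criteria are hypothetical), the underlying idea can be pictured as mechanically expanding a user story with extra acceptance criteria derived from a usability mechanism such as "undo":

from dataclasses import dataclass, field

@dataclass
class UserStory:
    title: str
    narrative: str
    acceptance_criteria: list = field(default_factory=list)

# Hypothetical catalogue: usability mechanism -> acceptance criteria it induces.
USABILITY_MECHANISMS = {
    "undo": [
        "The user can revert the last action without losing other changes.",
        "The undo option remains visible while the action can still be reverted.",
    ],
    "warning": [
        "A confirmation dialog is shown before any destructive action.",
    ],
}

def apply_mechanism(story: UserStory, mechanism: str) -> UserStory:
    # Return a copy of the story enriched with the criteria induced by the mechanism.
    extra = USABILITY_MECHANISMS[mechanism]
    return UserStory(story.title, story.narrative, story.acceptance_criteria + extra)

if __name__ == "__main__":
    story = UserStory(
        "Delete an expense entry",
        "As an accountant, I want to delete an expense entry so that I can fix mistakes.",
        ["The deleted entry no longer appears in the expense list."],
    )
    enriched = apply_mechanism(story, "undo")
    print(enriched.title)
    for criterion in enriched.acceptance_criteria:
        print(" -", criterion)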

Relevance: 30.00%

Publisher:

Abstract:

A new version of the TomoRebuild data reduction software package is presented for the reconstruction of scanning transmission ion microscopy tomography (STIMT) and particle induced X-ray emission tomography (PIXET) images. First, we present a state of the art of the reconstruction codes available for ion beam microtomography. The algorithm proposed here brings several advantages. It is a portable, multi-platform code, designed in C++ with well-separated classes for easier use and evolution. Data reduction is separated into different steps, and the intermediate results may be checked if necessary. Although no additional graphics library or numerical tool is required to run the program from the command line, a user-friendly interface was designed in Java as an ImageJ plugin. All experimental and reconstruction parameters may be entered either through this plugin or directly in text-format files. A simple standard format is proposed for the input of experimental data. Optional graphic applications using the ROOT interface may be used separately to display and fit energy spectra. Regarding the reconstruction process, the filtered backprojection (FBP) algorithm, already present in the previous version of the code, was optimized so that it is now about 10 times faster. In addition, the Maximum Likelihood Expectation Maximization (MLEM) algorithm and its accelerated version, Ordered Subsets Expectation Maximization (OSEM), were implemented. A detailed user guide in English is available. A reconstruction example with experimental data from a biological sample is given. It shows the capability of the code to reduce noise in the sinograms and to deal with incomplete data, which opens new perspectives for tomography using a low number of projections or a limited angular range.
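To make the iterative reconstruction step concrete, here is a minimal sketch (Python/NumPy) of the MLEM update mentioned above, applied to a toy problem with a dense system matrix; the real code works on STIMT/PIXET sinograms and is written in C++, and the matrix construction and sizes here are purely illustrative.

import numpy as np

def mlem(system_matrix, sinogram, n_iter=50):
    # Maximum Likelihood Expectation Maximization for y ~ Poisson(A x).
    # Multiplicative update: x <- x * A^T(y / (A x)) / (A^T 1).
    A = np.asarray(system_matrix, dtype=float)
    y = np.asarray(sinogram, dtype=float)
    x = np.ones(A.shape[1])                   # non-negative initial image
    sensitivity = A.T @ np.ones(A.shape[0])   # A^T 1, normalization term
    for _ in range(n_iter):
        projection = A @ x
        ratio = y / np.maximum(projection, 1e-12)
        x *= (A.T @ ratio) / np.maximum(sensitivity, 1e-12)
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.random((60, 16))                  # toy system matrix (60 rays, 16 pixels)
    x_true = rng.random(16)
    y = rng.poisson(A @ x_true * 50) / 50     # noisy toy "sinogram"
    x_hat = mlem(A, y, n_iter=200)
    print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))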

Relevance: 30.00%

Publisher:

Abstract:

Automated and semi-automated accessibility evaluation tools are key to streamlining the process of accessibility assessment and, ultimately, to ensuring that software products, contents, and services meet accessibility requirements. Different evaluation tools may better fit different needs and concerns, accounting for a variety of corporate and external policies, content types, invocation methods, deployment contexts, exploitation models, intended audiences and goals, and the specific overall process into which they are introduced. This has led to the proliferation of many evaluation tools tailored to specific contexts. However, tool creators, who may not be familiar with the realm of accessibility and may be part of a larger project, lack systematic guidance when facing the implementation of accessibility evaluation functionalities. Herein we present a systematic approach to the development of accessibility evaluation tools, leveraging the different artifacts and activities of a standardized development process model (the Unified Software Development Process), and providing templates of these artifacts tailored to accessibility evaluation tools. The work presented especially considers the work in progress in this area by the W3C/WAI Evaluation and Report Working Group (ERT WG).

Relevance: 30.00%

Publisher:

Abstract:

This paper describes the authors' experience with static analysis of both the WCET and the stack usage of a satellite on-board software subsystem. The work is a continuation of a previous case study that used a dynamic WCET analysis tool on an earlier version of the same software system. In particular, the AbsInt aiT tool has been evaluated by analysing both C and Ada code generated by Simulink within the UPMSat-2 project. Some aspects of the aiT tool, specifically those dealing with SPARC register windows, are compared with another static analysis tool, Bound-T. The results of the analysis are discussed, and some conclusions on the use of static WCET analysis tools on the SPARC architecture are drawn.

Relevance: 30.00%

Publisher:

Abstract:

This paper presents an open-source simulation tool which is being developed within the framework of a European research project. The tool, whose final version will be freely available through a website, allows the modelling and design of different types of grid-connected PV systems, such as large grid-connected plants and building-integrated installations. The tool is based on previous software developed by the IES-UPM, whose models and energy-loss scenarios have been validated during the commissioning of PV projects carried out in Spain, Portugal, France and Italy, with an aggregated capacity of nearly 300 MW. This link between design and commissioning is one of the key points of the tool presented here, and one that is not usually addressed by present commercial software. The tool provides, among other simulation results, the energy yield, the analysis and breakdown of energy losses, and estimations of financial returns adapted to the legal and financial frameworks of each European country. Besides, educational facilities will be developed and integrated into the tool, devoted not only to learning how to use this software, but also to training users in best practices for designing PV systems. The tool will also include the recommendations of several PV community experts, who have been invited to identify present needs in the field of PV system simulation, for example the possibility of using meteorological forecasts as input data, or modelling the integration of large energy storage systems, such as vanadium redox or lithium-ion batteries. Finally, it is worth mentioning that, during the verification and testing stages of this software development, the tool will also be open to suggestions received from the different actors of the PV community, such as promoters, installers, consultants, etc.
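As a simplified, hypothetical illustration of the kind of calculation such a tool performs (not the IES-UPM models themselves), the sketch below (Python) estimates the annual energy yield from the nominal power, the in-plane irradiation and a multiplicative breakdown of losses; all numeric values are placeholders.

# Simplified annual yield estimate for a grid-connected PV system:
# E = P_nom * (H_poa / G_stc) * product(1 - loss_i)
G_STC = 1.0  # kW/m^2, irradiance at Standard Test Conditions

def annual_energy_yield(p_nom_kw, h_poa_kwh_m2, losses):
    # Return (energy in kWh, performance ratio) for a loss breakdown given as fractions.
    performance_ratio = 1.0
    for fraction in losses.values():
        performance_ratio *= (1.0 - fraction)
    energy = p_nom_kw * (h_poa_kwh_m2 / G_STC) * performance_ratio
    return energy, performance_ratio

if __name__ == "__main__":
    losses = {                 # hypothetical loss scenario
        "temperature": 0.06,
        "soiling": 0.02,
        "dc_wiring": 0.015,
        "inverter": 0.04,
        "shading": 0.01,
    }
    energy, pr = annual_energy_yield(p_nom_kw=100.0, h_poa_kwh_m2=1800.0, losses=losses)
    print(f"performance ratio: {pr:.3f}, annual yield: {energy:,.0f} kWh")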

Relevance: 30.00%

Publisher:

Abstract:

Neuro-evolutive development from birth until the age of six years is a decisive factor in a child's quality of life. Early detection of developmental disorders in early childhood can facilitate the necessary diagnosis and/or treatment. Primary-care pediatricians play a key role in this detection, as they can undertake the preventive and therapeutic actions required to promote a child's optimal development. However, lack of time and limited specific knowledge in primary care prevent the continuous application of early anomaly-detection procedures. This research paper focuses on the deployment and evaluation of a smart system that enhances the screening of language disorders in primary care. Pediatricians get support to proceed with the early referral of language disorders. The proposed model provides them with a decision-support tool for referral actions, in order to trigger essential diagnostic and/or therapeutic actions for comprehensive individual development. The research started from a sample of 60 cases of children with language disorders. Validation was carried out in two complementary steps: first, with a team of seven experts from the fields of neonatology, pediatrics, neurology and language therapy, and, second, through the evaluation of 21 further previously diagnosed cases. The results obtained show that the therapists positively accepted the system proposal in 18 cases (86%) and suggested a system redesign for single referral to a speech therapist in the three remaining cases.

Relevance: 30.00%

Publisher:

Abstract:

An accepted fact in software engineering is that software must undergo a verification and validation process during development to ascertain and improve its quality level. But there are more techniques than a single developer could master, and yet it is impossible to be certain that software is free of defects. So, it is crucial for developers to be able to choose, from the available evaluation techniques, the one most suitable and most likely to yield optimum quality results for a given product. Some knowledge is available on the strengths and weaknesses of the available software quality assurance techniques, but not much is known yet about the relationship between different techniques and their behavior in different contexts. Objective: This research investigates the effectiveness of two testing techniques, equivalence class partitioning and decision coverage, and one review technique, code review by abstraction, in terms of their fault detection capability. This will be used to strengthen the practical knowledge available on these techniques.
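As a small, hypothetical example of the first technique, equivalence class partitioning derives one test per input class rather than testing every possible value; the function under test and its classes below are invented for illustration only.

def shipping_fee(weight_kg):
    # Function under test (hypothetical): fee by weight bracket.
    if weight_kg <= 0:
        raise ValueError("weight must be positive")
    if weight_kg <= 2:
        return 5.0
    if weight_kg <= 20:
        return 12.0
    return 30.0

# Equivalence classes: invalid (<= 0), light (0, 2], medium (2, 20], heavy (> 20).
# One representative value per class is enough to cover the partition.
def test_equivalence_classes():
    for weight, expected in [(1.0, 5.0), (10.0, 12.0), (25.0, 30.0)]:
        assert shipping_fee(weight) == expected
    try:
        shipping_fee(-3.0)
        assert False, "expected ValueError for the invalid class"
    except ValueError:
        pass

if __name__ == "__main__":
    test_equivalence_classes()
    print("all equivalence-class tests passed")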

Relevance: 30.00%

Publisher:

Abstract:

Spanish wheat (Triticum spp.) landraces show considerable polymorphism, containing many unique alleles relative to other collections. A core collection is a favored approach for breeders to efficiently explore novel variation and enhance the use of germplasm. In this study, the Spanish durum wheat (Triticum turgidum L.) core collection (CC) was created using a population structure–based method, grouping accessions by subspecies and allocating the number of genotypes among populations according to the diversity of simple sequence repeat (SSR) markers. The CC of 94 genotypes was established, which accounted for 17% of the accessions in the entire collection. An alternative core collection (CH), with the same number of genotypes per subspecies and maximizing the coverage of SSR alleles, was assembled with the Core Hunter software. The quality of both core collections was compared with a random core collection and evaluated using geographic, agromorphological, and molecular marker data not previously used in the selection of genotypes. Both core collections had a high genetic representativeness, which validated their sampling strategies. The geographic and agromorphological variation, phenotypic correlations, and gliadin alleles of the original collection were more accurately depicted by the CC. Diversity arrays technology (DArT) markers revealed that the CC included genotypes that were less similar to one another than those in the CH. Although more SSR alleles were retained by the CH (94%) than by the CC (91%), the results showed that the CC was better suited than the CH for breeding purposes.
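To illustrate the allele-coverage criterion used when comparing core collections (a generic Python sketch, not the Core Hunter algorithm; the genotype-by-marker data below are invented), the fraction of SSR alleles from the whole collection that is retained by a candidate core can be computed as follows:

def allele_coverage(genotypes, core_ids):
    # Fraction of (marker, allele) pairs in the full collection retained by the core subset.
    # `genotypes` maps accession id -> {marker: allele}.
    def alleles(ids):
        return {(marker, allele) for i in ids for marker, allele in genotypes[i].items()}

    full = alleles(genotypes)
    core = alleles(core_ids)
    return len(core & full) / len(full)

if __name__ == "__main__":
    # Invented toy data: 5 accessions scored at 3 SSR markers.
    genotypes = {
        "acc1": {"Xgwm1": 180, "Xgwm2": 210, "Xgwm3": 150},
        "acc2": {"Xgwm1": 182, "Xgwm2": 210, "Xgwm3": 152},
        "acc3": {"Xgwm1": 180, "Xgwm2": 214, "Xgwm3": 150},
        "acc4": {"Xgwm1": 184, "Xgwm2": 214, "Xgwm3": 154},
        "acc5": {"Xgwm1": 182, "Xgwm2": 212, "Xgwm3": 152},
    }
    print(f"coverage of ['acc1', 'acc2']: "
          f"{allele_coverage(genotypes, ['acc1', 'acc2']):.2f}")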

Relevance: 30.00%

Publisher:

Abstract:

Background Gray-scale images make up the bulk of data in biomedical image analysis, and hence the main focus of many image processing tasks lies in the processing of these monochrome images. With ever-improving acquisition devices, spatial and temporal image resolution increases and data sets become very large. Various image processing frameworks exist that make the development of new algorithms easy by using high-level programming languages or visual programming. These frameworks are also accessible to researchers who have little or no background in software development, because they take care of otherwise complex tasks; specifically, the management of working memory is handled automatically, usually at the price of requiring more of it. As a result, processing large data sets with these tools becomes increasingly difficult on workstation-class computers. One alternative to using these high-level processing tools is the development of new algorithms in a language like C++, which gives the developer full control over how memory is handled, but the resulting workflow for prototyping new algorithms is rather time-intensive and not appropriate for a researcher with little or no knowledge of software development. Another alternative is to use command line tools that run image processing tasks, use the hard disk to store intermediate results, and provide automation through shell scripts. Although not as convenient as, e.g., visual programming, this approach is still accessible to researchers without a background in computer science. However, only a few tools provide this kind of processing interface; they are usually quite task-specific, and they do not provide a clear path when one wants to turn a prototype shell script into a new command line tool. Results The proposed framework, MIA, provides a combination of command line tools, plug-ins, and libraries that make it possible to run image processing tasks interactively in a command shell and to prototype using the corresponding shell scripting language. Since the hard disk serves as temporary storage, memory management is usually a non-issue in the prototyping phase. By using string-based descriptions for filters, optimizers, and the like, the transition from shell scripts to full-fledged programs implemented in C++ is also made easy. In addition, its design based on atomic plug-ins and single-task command line tools makes it easy to extend MIA, usually without the need to touch or recompile existing code. Conclusion In this article, we describe the general design of MIA, a general-purpose framework for gray-scale image processing. We demonstrate the applicability of the software with example applications from three different research scenarios, namely motion compensation in myocardial perfusion imaging, the processing of high-resolution image data arising in virtual anthropology, and retrospective analysis of treatment outcome in orthognathic surgery. With MIA, prototyping algorithms by using shell scripts that combine small, single-task command line tools is a viable alternative to the use of high-level languages, an approach that is especially useful when large data sets need to be processed.

Relevance: 30.00%

Publisher:

Abstract:

Background DCE@urLAB is a software application for the analysis of dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) data. The tool incorporates a friendly graphical user interface (GUI) to interactively select and analyze a region of interest (ROI) within the image set, taking into account the tissue concentration of the contrast agent (CA) and its effect on pixel intensity. Results Pixel-wise, model-based quantitative parameters are estimated by fitting the DCE-MRI data to several pharmacokinetic models using the Levenberg-Marquardt algorithm (LMA). DCE@urLAB also includes the semi-quantitative parametric and heuristic analysis approaches commonly used in practice. The software application has been programmed in the Interactive Data Language (IDL) and tested both with publicly available simulated data and with preclinical studies of tumor-bearing mouse brains. Conclusions A user-friendly solution for applying pharmacokinetic and non-quantitative analysis of DCE-MRI in preclinical studies has been implemented and tested. The proposed tool has been specially designed for the easy selection of multi-pixel ROIs. A public release of DCE@urLAB, together with the open source code and sample datasets, is available at http://www.die.upm.es/im/archives/DCEurLAB/.
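As a hedged sketch of the model-fitting step (the standard Tofts model is one pharmacokinetic model commonly fitted in DCE-MRI analysis; whether DCE@urLAB uses exactly this parameterization is not stated here, and the arterial input function and data below are synthetic placeholders), a Levenberg-Marquardt fit in Python looks like this:

import numpy as np
from scipy.optimize import curve_fit

t = np.linspace(0, 5, 120)                            # minutes
aif = 5.0 * (np.exp(-0.5 * t) - np.exp(-4.0 * t))     # synthetic arterial input function
dt = t[1] - t[0]

def tofts(t, ktrans, kep):
    # Standard Tofts model: Ct(t) = Ktrans * convolution of Cp(t) with exp(-kep t).
    kernel = np.exp(-kep * t)
    return ktrans * np.convolve(aif, kernel)[:len(t)] * dt

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    true_params = (0.25, 0.8)                         # Ktrans [1/min], kep [1/min]
    data = tofts(t, *true_params) + 0.01 * rng.standard_normal(len(t))
    # For unbounded problems, curve_fit uses the Levenberg-Marquardt algorithm.
    popt, _ = curve_fit(tofts, t, data, p0=(0.1, 0.5))
    print(f"fitted Ktrans = {popt[0]:.3f} /min, kep = {popt[1]:.3f} /min")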