827 results for Verification and validation technology


Relevância:

100.00%

Publicador:

Resumo:

Cost, performance and availability considerations are forcing even the most conservative high-integrity embedded real-time systems industry to migrate from simple hardware processors to ones equipped with caches and other acceleration features. This migration disrupts the practices and solutions that industry had developed and consolidated over the years to perform timing analysis. Industries that are confident in the efficiency and effectiveness of their verification and validation processes for old-generation processors do not have sufficient insight into the effects of the migration to cache-equipped processors. Caches are perceived as an additional source of complexity, with the potential to shatter the guarantees of cost- and schedule-constrained qualification of their systems. The current industrial approach to timing analysis is ill-equipped to cope with the variability incurred by caches. Conversely, the application of advanced WCET analysis techniques to real-world industrial software, developed without analysability in mind, is hardly feasible. We propose a development approach aimed at minimising cache jitter, as well as at enabling the application of advanced WCET analysis techniques to industrial systems. Our approach builds on: (i) identification of those software constructs that may impede or complicate timing analysis in industrial-scale systems; (ii) elaboration of practical means, under the model-driven engineering (MDE) paradigm, to enforce the automated generation of software that is analysable by construction; (iii) implementation of a layout optimisation method to remove cache jitter stemming from the software layout in memory, with the intent of facilitating incremental software development, which is of high strategic interest to industry. The integration of those constituents in a structured approach to timing analysis achieves two interesting properties: the resulting software is analysable from the earliest releases onwards - as opposed to becoming so only when the system is final - and is more easily amenable to advanced timing analysis by construction, regardless of the system's scale and complexity.
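One source of the layout-induced jitter mentioned above is that two routines placed at conflicting addresses evict each other's lines from the instruction cache, so their execution time depends on where the linker happened to put them. The sketch below is illustrative only: the cache geometry and the function placements are assumed values chosen for the example, not the thesis' layout optimiser.

```python
# Illustrative sketch only: not the thesis' layout optimiser. The cache geometry and the
# function addresses below are hypothetical values chosen for the example.

LINE_SIZE = 64          # bytes per cache line (assumed)
NUM_SETS = 256          # sets in a direct-mapped instruction cache (assumed)

def cache_sets(start_addr: int, size: int) -> set[int]:
    """Return the cache sets touched by a code region [start_addr, start_addr + size)."""
    first_line = start_addr // LINE_SIZE
    last_line = (start_addr + size - 1) // LINE_SIZE
    return {line % NUM_SETS for line in range(first_line, last_line + 1)}

def conflict_sets(layout: dict[str, tuple[int, int]]) -> dict[frozenset, set[int]]:
    """Report which pairs of functions compete for the same cache sets."""
    conflicts = {}
    names = list(layout)
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            shared = cache_sets(*layout[a]) & cache_sets(*layout[b])
            if shared:
                conflicts[frozenset((a, b))] = shared
    return conflicts

# Two alternative placements of the same functions: the second avoids set conflicts,
# so a loop alternating between the two functions no longer evicts the other's lines.
layout_a = {"sensor_read": (0x0000, 4096), "control_law": (0x10000, 4096)}
layout_b = {"sensor_read": (0x0000, 4096), "control_law": (0x01000, 4096)}
print(conflict_sets(layout_a))   # non-empty: both functions map onto the same sets
print(conflict_sets(layout_b))   # empty: contiguous placement removes the conflict
```

A layout optimiser of the kind described in point (iii) would search for placements like `layout_b`, so that the timing of each routine stays stable as other parts of the software are added or changed.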

Relevância:

100.00%

Publicador:

Resumo:

Autonomous mobile robots and remotely operated mobile robots are currently used successfully in a large number of domains, some as diverse as home cleaning, movement of goods in warehouses and space exploration. However, as in other areas of computing, it is difficult to guarantee the absence of defects in the programs that control these devices. There are different ways of measuring the quality of a system in performing the functions for which it was designed; one of them is reliability. For most physical systems, reliability degrades as the system ages, generally because of wear. In software systems this does not usually happen: the defects they contain are generally not acquired over time but introduced during development. If we focus on the coding stage of the software development process, we can consider a study that determines the reliability of different algorithms, all valid for the same task, according to the defects that programmers may introduce. Such a basic study has several applications, such as choosing the algorithm least sensitive to programming defects for the development of a critical system, or establishing more demanding verification and validation procedures when an algorithm with high sensitivity to defects must be used. This research studied the influence of certain types of software defects on the reliability of three multivariable speed controllers (PID, Fuzzy and LQR) acting on a specific mobile robot. The hypothesis is that the controllers studied offer different reliability when affected by similar defect patterns, and this was confirmed by the results obtained. From the viewpoint of experimental planning, the necessary tests were first conducted to determine whether controllers of the same family (PID, Fuzzy or LQR) offered similar reliability under the same experimental conditions. Once this was confirmed, a class representative was chosen at random from each controller family to perform a more comprehensive set of tests, with the purpose of obtaining data to compare the reliability of the controllers under study more extensively. The impossibility of performing a large number of tests with a real robot, and the need to avoid damaging a device that generally has a significant cost, led us to build a multicomputer simulator of the robot. This simulator was used both to obtain well-tuned controllers and to carry out the tests required for the reliability experiment.
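As a rough illustration of the kind of fault-injection experiment described above, the sketch below mutates controller gains to emulate coding defects and estimates a failure rate over repeated simulated runs. The plant model, the defect patterns and the failure criterion are hypothetical stand-ins, not the thesis' multicomputer simulator or its defect taxonomy.

```python
# Minimal fault-injection sketch, not the thesis' experimental harness: the plant model,
# the defect patterns and the failure criterion are hypothetical stand-ins.
import random

def pid_step(state, error, dt, kp, ki, kd):
    """One step of a textbook PID controller; returns (command, new_state)."""
    integral, prev_error = state
    integral += error * dt
    derivative = (error - prev_error) / dt
    return kp * error + ki * integral + kd * derivative, (integral, error)

def run_episode(kp, ki, kd, steps=500, dt=0.01, setpoint=1.0):
    """Drive a crude first-order plant to the setpoint; fail if the error never settles."""
    y, state = 0.0, (0.0, 0.0)
    for _ in range(steps):
        u, state = pid_step(state, setpoint - y, dt, kp, ki, kd)
        y += dt * (-y + u)                      # toy plant dynamics
    return abs(setpoint - y) < 0.05             # success criterion (assumed)

def inject_defect(gains):
    """Emulate a simple coding defect: a sign flip, a dropped term or an off-by-factor gain."""
    i = random.randrange(3)
    mutated = list(gains)
    mutated[i] *= random.choice([-1.0, 0.0, 10.0])
    return tuple(mutated)

nominal = (2.0, 1.0, 0.1)
trials = 200
failures = sum(not run_episode(*inject_defect(nominal)) for _ in range(trials))
print(f"estimated failure rate under injected defects: {failures / trials:.2f}")
```

Repeating the same experiment with Fuzzy and LQR controller implementations, under the same defect patterns, is the kind of comparison the thesis makes at much larger scale.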

Relevância:

100.00%

Publicador:

Resumo:

The Accessible Digital Home (Hogar Digital Accesible, HDA) of the ETSIST was created with the aim of bringing new information and communication technologies closer to people with specific accessibility and usability needs, providing them with tools that increase their quality of life, comfort, security and autonomy. The HDA environment includes control elements for doors, blinds, lighting, water and gas; temperature, fire and gas sensors; air-conditioning systems; entertainment systems; and security systems such as presence detectors and alarms. Everything is supported by a network architecture that provides a residential gateway and broadband access. The main goal of this final-year project (PFG) was the development of a low-cost authentication system for the HDA. The idea of integrating an authentication system into the HDA stems from the need to protect certain services available within a private environment from unauthorized access, such as reading the messages stored on the answering machine, using multimedia equipment, deactivating security alarms, or simply configuring the environment according to the authenticated user (light intensity, room temperature, etc.). Priority was given during development to the accessibility, usability and security principles needed to create a non-invasive environment that allows users to prove their identity to the HDA system. The proposed solution is a system based on recognising a stroke drawn by the user. This stroke is used as a key to validate users: to authenticate, the user must repeat the stroke registered in the system. The choice of this authentication mechanism over other alternatives available on the market is justified in the project. Two peripherals of different ranges were available to test the application: the uDraw, created for the PS3, which consists of a digitizing tablet and a pen and captures the user's strokes wirelessly, and the Wacom Bamboo digitizing tablet, which supports the same functionality with better accuracy. The developed tool can also be used with other kinds of devices, such as the Texas Instruments Chronos eZ430 watch with a 3-axis accelerometer, which can translate the user's movements into mouse-pointer motion. The PFG is divided into three main workflow blocks. The first focuses on the analysis of the system and the technologies that compose it, including the image-based pattern-recognition algorithms for authentication that best fit the user's needs. The second covers a trial version built on the previous analysis and UML design, on which proof-of-concept tests were run and the viability of the project was verified. The last block includes the verification and validation of the system through tests certifying that the quality levels required to meet the stated objectives were reached, together with the production of the necessary documentation. As a result of the work, a system with an easily extensible architecture was obtained, achieved through techniques such as introspection, which separate the business-logic layer from the code that implements it; code can be replaced simply and intuitively through configuration files, making the system flexible and scalable. After completing the PFG, it can be concluded that the final product responded satisfactorily, reaching the required quality levels and providing an alternative to conventional authentication systems that maintains high security while making accessibility and price its most notable features.
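The abstract does not spell out which pattern-matching algorithm validates a stroke, so the sketch below only illustrates one common, simple approach: resample both strokes to a fixed number of points, normalise position and scale, and threshold the mean point-to-point distance. The 64-point resampling and the acceptance threshold are assumptions made for the example.

```python
# Illustrative stroke-matching sketch; not necessarily the algorithm used in the project.
# Thresholds and resampling resolution are made-up example values.
import math

def resample(points, n=64):
    """Resample a polyline (list of (x, y)) to n equally spaced points along its length."""
    dists = [0.0]
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        dists.append(dists[-1] + math.hypot(x1 - x0, y1 - y0))
    total = dists[-1] or 1.0
    out, j = [], 0
    for i in range(n):
        target = total * i / (n - 1)
        while j < len(points) - 2 and dists[j + 1] < target:
            j += 1
        span = dists[j + 1] - dists[j] or 1.0
        t = (target - dists[j]) / span
        x = points[j][0] + t * (points[j + 1][0] - points[j][0])
        y = points[j][1] + t * (points[j + 1][1] - points[j][1])
        out.append((x, y))
    return out

def normalise(points):
    """Translate to the centroid and scale to unit size so position and size do not matter."""
    xs, ys = zip(*points)
    cx, cy = sum(xs) / len(xs), sum(ys) / len(ys)
    scale = max(max(xs) - min(xs), max(ys) - min(ys)) or 1.0
    return [((x - cx) / scale, (y - cy) / scale) for x, y in points]

def stroke_distance(a, b):
    a, b = normalise(resample(a)), normalise(resample(b))
    return sum(math.hypot(xa - xb, ya - yb) for (xa, ya), (xb, yb) in zip(a, b)) / len(a)

def authenticate(candidate, template, threshold=0.08):   # threshold is an assumption
    return stroke_distance(candidate, template) <= threshold

template = [(0, 0), (1, 1), (2, 0)]                       # registered stroke (toy data)
attempt = [(0.1, 0.0), (1.05, 0.95), (2.0, 0.1)]
print(authenticate(attempt, template))                    # expected: True
```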

Relevância:

100.00%

Publicador:

Resumo:

Test Engineering is the specialization of Software Engineering concerned with the verification and validation of software, formally defined as: "a development process that employs rigorous methods to evaluate the correctness and quality of the product throughout its entire life cycle" [3]. This process comprises a set of formally defined methods, procedures and techniques which, used systematically, help identify as many errors and faults as possible in a piece of software. Software that passes a rigorous testing process is a quality product that will ease the software engineer's work when correcting future incidents, some of which arise only after deployment in the real environment. This process constitutes an area of Software Engineering and, therefore, a specialization of it. Put simply, correct verification and validation of software requires some essential activities: drawing up a test plan for the project; updating that plan and correcting it when necessary; reviewing the requirements analysis documents; executing the tests in the different phases of the project's development; documenting the design and execution of the tests; and producing documents with the results and anomalies of the tests already executed. At present, Test Engineering is not widely recognised as an independent area of work, but rather as an area embedded within Software Engineering. The Test Engineer profile exists in industry, yet few software engineers are sure they want to become test engineers (testers), because they have never had the opportunity to face real practical activities at the universities where they study. Being an area of inherently professional practice, the corresponding part of Test Engineering is usually approached from a theoretical rather than a practical point of view. There are many tools for creating tests and supporting test engineers, but most are commercial or custom-built for large companies that need such software. People normally learn what Test Engineering is only when they start to gain experience in that area through professional practice within a company. Consequently, their degree studies have not necessarily offered engineering professionals the opportunity to work in this branch of Software Engineering, and in some cases recent graduates begin their professional life with some lack of knowledge in this respect. For all these reasons, my intention in this project is to propose a methodology, and a software tool supporting it (NOVATests), so that students of Software Engineering and related degrees, and recently graduated engineers with little or no experience in this area (Junior Test Engineers), can put the activities of Test Engineering into practice in an environment as close as possible to professional work. In this way they can carry out the tasks of this area in an easy and intuitive manner, gaining greater knowledge of and experience in it.
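To make the activities listed above concrete, here is a minimal example of the kind of documented, automated test case a Junior Test Engineer might produce while executing a test plan. It is not taken from the NOVATests methodology or tool; the requirement identifier and the function under test are hypothetical.

```python
# Illustrative only: a minimal, self-documenting test case of the kind a junior test
# engineer would write and record in a test plan. It is not taken from the NOVATests tool;
# the requirement ID and the function under test are hypothetical.
import unittest

def normalise_speed(value, max_speed=100):
    """Function under test: clamp a requested speed into [0, max_speed]."""
    return max(0, min(value, max_speed))

class TestNormaliseSpeed(unittest.TestCase):
    """Test-plan entry TP-007 / requirement REQ-12 (hypothetical identifiers):
    requested speeds outside the valid range must be clamped, never rejected."""

    def test_within_range_is_unchanged(self):
        self.assertEqual(normalise_speed(40), 40)

    def test_negative_is_clamped_to_zero(self):
        self.assertEqual(normalise_speed(-5), 0)

    def test_above_maximum_is_clamped(self):
        self.assertEqual(normalise_speed(250), 100)

if __name__ == "__main__":
    # Running the suite and archiving its output covers the "execute and document" steps.
    unittest.main(verbosity=2)
```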

Relevância:

100.00%

Publicador:

Resumo:

This thesis presents the development and application of turbulence, laminar-turbulent transition and fluid-structure interaction models to the external flow around a stationary rigid cylinder and around a cylinder undergoing vortex-induced vibrations. The developments were implemented in the ReFRESCO code, based on computational fluid dynamics (CFD) techniques. A study of the performance of the k-ω SST model over a wide range of Reynolds numbers was carried out, through which the modelling deficiencies for this flow were identified. Scale-adaptive simulation (SAS) and the local-correlation-based transition model (LCTM), both combined with SST, improved the agreement with experimental results for this flow, an original contribution of this work. The application of verification and validation techniques allowed errors and uncertainties to be estimated for the models and Reynolds numbers considered, which is identified as another contribution of this work. The combination of SST, SAS and LCTM modelling with imposed motions was carried out for moderate Reynolds numbers and different vibration frequencies and amplitudes, something that few publications address in detail. With respect to free motions, this work contributes with the application of the SST and SAS models to the study of vortex-induced vibrations with two degrees of freedom, low mass ratio and moderate Reynolds numbers, higher than those usually observed in the literature. Finally, the investigation of the relative importance of turbulence effects in the free- and imposed-motion cases, compared with the stationary-cylinder case, confirmed the conjecture formulated in the first part of this work regarding the choice of turbulence model in certain applications. That choice proved less decisive for the cylinder with imposed motion, and even less so for free motions, than for the stationary case, since the body's motion response filters out a large part of the higher-order turbulence effects. This observation is relevant because it may allow simplifications in the modelling and in the application of CFD tools to an important class of engineering projects.
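The abstract mentions that verification and validation techniques were applied to estimate errors and uncertainties. A common textbook verification procedure in CFD is grid-convergence analysis via Richardson extrapolation; the sketch below shows that procedure with made-up sample values, and is not necessarily the exact method used in the thesis.

```python
# A common textbook procedure for the "verification" half of CFD V&V: estimate the
# observed order of accuracy and a discretisation uncertainty (grid convergence index)
# from solutions on three systematically refined grids. It is not necessarily the exact
# procedure used in the thesis; the sample values below are made up.
import math

def observed_order(f_coarse, f_medium, f_fine, r=2.0):
    """Observed order of accuracy p from three solutions with constant refinement ratio r."""
    return math.log(abs(f_coarse - f_medium) / abs(f_medium - f_fine)) / math.log(r)

def gci_fine(f_medium, f_fine, p, r=2.0, safety=1.25):
    """Grid convergence index (relative) on the fine grid."""
    rel_err = abs((f_medium - f_fine) / f_fine)
    return safety * rel_err / (r**p - 1.0)

# Hypothetical drag-coefficient values on coarse, medium and fine grids.
cd = {"coarse": 1.245, "medium": 1.212, "fine": 1.201}
p = observed_order(cd["coarse"], cd["medium"], cd["fine"])
u = gci_fine(cd["medium"], cd["fine"], p)
print(f"observed order ~ {p:.2f}, discretisation uncertainty ~ {100 * u:.2f}% of fine-grid Cd")
```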

Relevância:

100.00%

Publicador:

Resumo:

1/2-meter resolution 1:5,000 orthophoto image of the Boston region from April 2001. This datalayer is a subset (covering only the Boston region) of the Massachusetts statewide orthophoto image series available from MassGIS. It consists of 23 orthophoto quads mosaicked together (MassGIS orthophoto quad ID: 229890, 229894, 229898, 229902, 233886, 233890, 233894, 233898, 233902, 233906, 233910, 237890, 237894, 237898, 237902, 237906, 237910, 241890, 241894, 241898, 241902, 245898, 245902). These medium resolution true color images are considered the new "basemap" for the Commonwealth by MassGIS and the Executive Office of Environmental Affairs (EOEA). MassGIS/EOEA and the Massachusetts Highway Department jointly funded the project. The photography for the mainland was captured in April 2001 when deciduous trees were mostly bare and the ground was generally free of snow. The geographic extent of this dataset is the same as that of the MassGIS dataset: Boston, Massachusetts Region LIDAR First Return Elevation Data, 2002 [see cross references].

Relevância:

100.00%

Publicador:

Resumo:

Software Configuration Management is the discipline of managing large collections of software development artefacts from which software products are built. Software configuration management tools typically deal with artefacts at fine levels of granularity - such as individual source code files - and assist with coordination of changes to such artefacts. This paper describes a lightweight tool, designed to be used on top of a traditional file-based configuration management system. The add-on tool support enables users to flexibly define new hierarchical views of product structure, independent of the underlying artefact-repository structure. The tool extracts configuration and change data with respect to the user-defined hierarchy, leading to improved visibility of how individual subsystems have changed. The approach yields a range of new capabilities for build managers, and verification and validation teams. The paper includes a description of our experience using the tool in an organization that builds large embedded software systems.
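As a rough sketch of the idea described above (not the paper's tool), the snippet below maps a user-defined hierarchical view onto path patterns in the underlying artefact repository and rolls change records up per subsystem. The view, the change-log format and the file paths are hypothetical.

```python
# A minimal sketch of the hierarchical-view idea, under assumed data shapes: a user-defined
# view maps subsystem names to path patterns in the file-based repository, and change
# records are rolled up per subsystem. It is not the paper's tool.
from fnmatch import fnmatch
from collections import Counter

# Hypothetical hierarchical view, independent of how files are laid out in the repository.
view = {
    "FlightControl/Guidance": ["src/gnc/*.c", "src/gnc/*.h"],
    "FlightControl/Actuators": ["src/act/*"],
    "GroundSupport": ["tools/*", "scripts/*.py"],
}

# Hypothetical change log extracted from the underlying configuration management system.
changes = [
    {"path": "src/gnc/kalman.c", "revision": 120},
    {"path": "src/act/servo.c", "revision": 121},
    {"path": "scripts/build.py", "revision": 122},
    {"path": "src/gnc/kalman.c", "revision": 123},
]

def subsystem_of(path):
    """Map a repository path to the first matching subsystem in the user-defined view."""
    for name, patterns in view.items():
        if any(fnmatch(path, pattern) for pattern in patterns):
            return name
    return "(unmapped)"

rollup = Counter(subsystem_of(c["path"]) for c in changes)
for subsystem, count in rollup.most_common():
    print(f"{subsystem}: {count} change(s)")
```

A report like this is what gives build managers and verification and validation teams visibility of how individual subsystems have changed, independently of the repository layout.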

Relevância:

100.00%

Publicador:

Resumo:

Introduction: Recent advances in the planning and delivery of radiotherapy treatments have resulted in improvements in the accuracy and precision with which therapeutic radiation can be administered. As the complexity of the treatments increases it becomes more difficult to predict the dose distribution in the patient accurately. Monte Carlo (MC) methods have the potential to improve the accuracy of the dose calculations and are increasingly being recognised as the ‘gold standard’ for predicting dose deposition in the patient [1]. This project has three main aims: 1. To develop tools that enable the transfer of treatment plan information from the treatment planning system (TPS) to an MC dose calculation engine. 2. To develop tools for comparing the 3D dose distributions calculated by the TPS and the MC dose engine. 3. To investigate the radiobiological significance of any errors between the TPS patient dose distribution and the MC dose distribution in terms of Tumour Control Probability (TCP) and Normal Tissue Complication Probabilities (NTCP). The work presented here addresses the first two aims. Methods: (1a) Plan Importing: A database of commissioned accelerator models (Elekta Precise and Varian 2100CD) has been developed for treatment simulations in the MC system (EGSnrc/BEAMnrc). Beam descriptions can be exported from the TPS using the widespread DICOM framework, and the resultant files are parsed with the assistance of a software library (PixelMed Java DICOM Toolkit). The information in these files (such as the monitor units, the jaw positions and gantry orientation) is used to construct a plan-specific accelerator model which allows an accurate simulation of the patient treatment field. (1b) Dose Simulation: The calculation of a dose distribution requires patient CT images which are prepared for the MC simulation using a tool (CTCREATE) packaged with the system. Beam simulation results are converted to absolute dose per MU using calibration factors recorded during the commissioning process and treatment simulation. These distributions are combined according to the MU meter settings stored in the exported plan to produce an accurate description of the prescribed dose to the patient. (2) Dose Comparison: TPS dose calculations can be obtained either via a DICOM export or by direct retrieval of binary dose files from the file system. Dose difference, gamma evaluation and normalised dose difference algorithms [2] were employed for the comparison of the TPS dose distribution and the MC dose distribution. These implementations are spatial-resolution independent and able to interpolate for comparisons. Results and Discussion: The tools successfully produced Monte Carlo input files for a variety of plans exported from the Eclipse (Varian Medical Systems) and Pinnacle (Philips Medical Systems) planning systems, ranging in complexity from a single uniform square field to a five-field step-and-shoot IMRT treatment. The simulation of collimated beams has been verified geometrically, and validation of dose distributions in a simple body phantom (QUASAR) will follow. The developed dose comparison algorithms have also been tested with controlled dose distribution changes. Conclusion: The capability of the developed code to independently process treatment plans has been demonstrated.
A number of limitations exist: only static fields are currently supported (dynamic wedges and dynamic IMRT will require further development), and the process has not been tested for planning systems other than Eclipse and Pinnacle. The tools will be used to independently assess the accuracy of the current treatment planning system dose calculation algorithms for complex treatment deliveries such as IMRT in treatment sites where patient inhomogeneities are expected to be significant. Acknowledgements: Computational resources and services used in this work were provided by the HPC and Research Support Group, Queensland University of Technology, Brisbane, Australia. Pinnacle dose parsing made possible with the help of Paul Reich, North Coast Cancer Institute, North Coast, New South Wales.
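The dose-comparison tools mentioned above implement dose difference, gamma evaluation and normalised dose difference [2]. As a rough illustration of the gamma-evaluation idea only (not the resolution-independent, interpolating implementation described here), the sketch below computes a brute-force 1D gamma index with the commonly used 3% / 3 mm criteria and made-up dose profiles.

```python
# A simple 1D illustration of the gamma-evaluation idea used to compare two dose
# distributions (reference vs. evaluated). It is not the implementation described in the
# abstract; the 3% / 3 mm criteria and the profiles are chosen only for the example.
import math

def gamma_1d(ref, evl, spacing_mm, dose_crit=0.03, dist_crit_mm=3.0):
    """Return a per-point gamma index for two equally sampled 1D dose profiles."""
    d_max = max(ref)                              # global normalisation (assumed)
    gammas = []
    for i, de in enumerate(evl):
        best = math.inf
        for j, dr in enumerate(ref):
            dist = (i - j) * spacing_mm
            dose_diff = (de - dr) / d_max
            candidate = math.hypot(dist / dist_crit_mm, dose_diff / dose_crit)
            best = min(best, candidate)
        gammas.append(best)
    return gammas

# Hypothetical profiles: the evaluated curve is slightly shifted and scaled.
ref = [0.1, 0.4, 0.9, 1.0, 0.9, 0.4, 0.1]
evl = [0.1, 0.35, 0.88, 1.02, 0.92, 0.45, 0.12]
g = gamma_1d(ref, evl, spacing_mm=2.0)
print("pass rate (gamma <= 1):", sum(v <= 1.0 for v in g) / len(g))
```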

Relevância:

100.00%

Publicador:

Resumo:

A network of ship-mounted real-time Automatic Weather Stations integrated with the Indian geosynchronous satellites [Indian National Satellites (INSATs)] 3A and 3C, named Indian National Centre for Ocean Information Services Real-Time Automatic Weather Stations (I-RAWS), is established. The purpose of I-RAWS is to measure surface meteorological and ocean parameters and transmit the data in real time in order to validate and refine the forcing parameters (obtained from different meteorological agencies) of the Indian Ocean Forecasting System (INDOFOS). Preliminary validation and intercomparison of analyzed products obtained from the National Centre for Medium Range Weather Forecasting and the European Centre for Medium-Range Weather Forecasts using the data collected from I-RAWS were carried out. This I-RAWS was mounted on board the oceanographic research vessel Sagar Nidhi during a cruise across three oceanic regimes, namely, the tropical Indian Ocean, the extratropical Indian Ocean, and the Southern Ocean. The results obtained from such a validation and intercomparison, and their implications, with special reference to the use of atmospheric model data for forcing the ocean model, are discussed in detail. It is noticed that the performance of analysis products from both atmospheric models is similar and good; however, European Centre for Medium-Range Weather Forecasts air temperature over the extratropical Indian Ocean and wind speed in the Southern Ocean are marginally better.
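The abstract does not list the exact statistics behind the validation and intercomparison, but a typical approach is to compute per-variable bias and RMSE of each model analysis against the ship observations. The sketch below does exactly that with illustrative numbers; the sample values merely stand in for NCMRWF and ECMWF air-temperature analyses.

```python
# A typical way to validate model analysis products against ship AWS observations is to
# compute bias and RMSE per variable and per model; the abstract does not list the exact
# statistics used, and the numbers below are illustrative only.
import math

def bias(obs, model):
    return sum(m - o for o, m in zip(obs, model)) / len(obs)

def rmse(obs, model):
    return math.sqrt(sum((m - o) ** 2 for o, m in zip(obs, model)) / len(obs))

# Hypothetical co-located hourly air-temperature samples (deg C) from one oceanic regime.
obs_t = [27.1, 26.8, 27.4, 26.9, 27.0]
ncmrwf_t = [27.5, 27.2, 27.9, 27.1, 27.4]   # stand-in for NCMRWF analysis values
ecmwf_t = [27.2, 26.9, 27.6, 27.0, 27.1]    # stand-in for ECMWF analysis values

for name, series in [("NCMRWF", ncmrwf_t), ("ECMWF", ecmwf_t)]:
    print(f"{name}: bias = {bias(obs_t, series):+.2f} degC, rmse = {rmse(obs_t, series):.2f} degC")
```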

Relevância:

100.00%

Publicador:

Resumo:

University of Twente; Centre for Telematics and Information Technology; Netherlands Organisation for Scientific Research; Jacquard; Capgemini

Relevância:

100.00%

Publicador:

Resumo:

Modern power networks incorporate communications and information technology infrastructure into the electrical power system to create a smart grid in terms of control and operation. The smart grid enables real-time communication and control between consumers and utility companies, allowing suppliers to optimize energy usage based on price preference and system technical issues. The smart grid design aims to provide overall power system monitoring and to create protection and control strategies that maintain system performance, stability and security. This dissertation contributed to the development of a unique and novel smart grid test-bed laboratory with integrated monitoring, protection and control systems. This test-bed was used as a platform to test the smart grid operational ideas developed here. The implementation of this system in real-time software creates an environment for studying, implementing and verifying the novel control and protection schemes developed in this dissertation. Phasor measurement techniques were developed using the available Data Acquisition (DAQ) devices in order to monitor all points in the power system in real time. This provides a practical view of system parameter changes and abnormal conditions, together with stability and security information, and supplies valuable measurements for technical power system operators in energy control centers. Phasor measurement technology is an excellent solution for improving system planning, operation and energy trading, in addition to enabling advanced applications in Wide Area Monitoring, Protection and Control (WAMPAC). Moreover, a virtual protection system was developed and implemented in the smart grid laboratory with integrated functionality for wide-area applications. Experiments and procedures were developed in the system in order to detect abnormal system conditions and apply proper remedies to heal the system. A DC microgrid was designed and integrated into the AC system with appropriate control capability. This system provides realistic hybrid AC/DC microgrid connectivity to the AC side, enabling study of how such an architecture can be used in system operation to help remedy abnormal conditions. In addition, this dissertation explored the challenges and feasibility of implementing real-time system analysis features in order to monitor system security and stability measures. These indices are measured experimentally during the operation of the developed hybrid AC/DC microgrids. Furthermore, a real-time optimal power flow system was implemented to optimally manage the power sharing between AC generators and DC-side resources. A study of a real-time energy-management algorithm in hybrid microgrids was performed to evaluate the effects of using energy storage resources and their role in mitigating heavy-load impacts on system stability and operational security.
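Phasor measurement from DAQ samples, as mentioned above, is commonly done with a single-bin DFT over one cycle of the sampled waveform. The sketch below shows that standard estimator on a synthetic 60 Hz signal; it is only an illustration of the technique, not the dissertation's implementation.

```python
# Phasor estimation from one cycle of DAQ samples via a single-bin DFT: a standard way to
# obtain magnitude and angle of a 50/60 Hz waveform. This is a sketch of the idea, not the
# dissertation's implementation; the sample waveform below is synthetic.
import cmath
import math

def phasor(samples, samples_per_cycle):
    """Estimate the fundamental phasor (RMS magnitude, angle in degrees) from one cycle."""
    n = samples_per_cycle
    acc = sum(samples[k] * cmath.exp(-2j * math.pi * k / n) for k in range(n))
    ph = (2.0 / n) * acc                      # peak-valued phasor of the fundamental
    return abs(ph) / math.sqrt(2.0), math.degrees(cmath.phase(ph))

# Synthetic 60 Hz voltage sampled at 32 samples/cycle: 100 V RMS with a -20 degree shift.
N = 32
v = [100 * math.sqrt(2) * math.cos(2 * math.pi * k / N - math.radians(20)) for k in range(N)]
mag, ang = phasor(v, N)
print(f"|V| = {mag:.1f} V rms, angle = {ang:.1f} deg")   # expect about 100 V and -20 deg
```

Streaming such phasors from many measurement points, time-aligned, is what enables the wide-area monitoring, protection and control applications the abstract refers to.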

Relevância:

100.00%

Publicador:

Resumo:

Background: Body composition is affected by diseases, and affects responses to medical treatments, dosage of medicines, etc., while an abnormal body composition contributes to the causation of many chronic diseases. While we have reliable biochemical tests for certain nutritional parameters of body composition, such as iron or iodine status, and we have harnessed nuclear physics to estimate the body’s content of trace elements, the very basic quantification of body fat content and muscle mass remains highly problematic. Both body fat and muscle mass are vitally important, as they have opposing influences on chronic disease, but they have seldom been estimated as part of population health surveillance. Instead, most national surveys have merely reported BMI and waist, or sometimes the waist/hip ratio; these indices are convenient but do not have any specific biological meaning. Anthropometry offers a practical and inexpensive method for muscle and fat estimation in clinical and epidemiological settings; however, its use is imperfect due to many limitations, such as a shortage of reference data, misuse of terminology, unclear assumptions, and the absence of properly validated anthropometric equations. To date, anthropometric methods are not sensitive enough to detect muscle and fat loss. Aims: The aim of this thesis is to estimate adipose/fat and muscle mass in health, disease and during weight loss by: 1. evaluating and critiquing the literature to identify the best published prediction equations for adipose/fat and muscle mass estimation; 2. deriving and validating adipose tissue and muscle mass prediction equations; and 3. evaluating the prediction equations, along with anthropometric indices and the best equations retrieved from the literature, in health, metabolic illness and during weight loss. Methods: A systematic review using the Cochrane Review method was used for reviewing muscle mass estimation papers that used MRI as the reference method. Fat mass estimation papers were critically reviewed. Data from subjects of mixed ethnicity, age and body mass who underwent whole-body magnetic resonance imaging to quantify adipose tissue and muscle mass (dependent variables), together with anthropometry (independent variables), were used in the derivation/validation analysis. Multiple regression and Bland-Altman plots were applied to evaluate the prediction equations. To determine how well the equations identify metabolic illness, English and Scottish health surveys were studied. Statistical analyses using multiple regression and binary logistic regression were applied to assess model fit and associations. Also, populations were divided into quintiles and relative risk was analysed. Finally, the prediction equations were evaluated by applying them to a pilot study of 10 subjects who underwent whole-body MRI, anthropometric measurements and muscle-strength testing before and after weight loss, to determine how well the equations identify changes in adipose/fat mass and muscle mass. Results: The estimation of fat mass has serious problems. Despite advances in technology and science, prediction equations for the estimation of fat mass depend on limited historical reference data and remain dependent upon assumptions that have not yet been properly validated for different population groups. Muscle mass does not have the same conceptual problems; however, its measurement is still problematic and reference data are scarce.
The derivation and validation analysis in this thesis was satisfactory; compared with prediction equations in the literature, the new equations were similar or even better. Applying the prediction equations in metabolic illness and during weight loss gave an understanding of how well the equations identify metabolic illness, showing significant associations with diabetes, hypertension, HbA1c and blood pressure, and moderate to high correlations with MRI-measured adipose tissue and muscle mass before and after weight loss. Conclusion: Adipose tissue mass and, to an extent, muscle mass can now be estimated for many purposes as population or group means. However, these equations must not be used for assessing fatness and categorising individuals. Further exploration in different populations and health surveys would be valuable.
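The methods mention Bland-Altman analysis for the derivation/validation work. As a generic illustration (with made-up values, not the thesis data), the sketch below computes the bias and 95% limits of agreement between an anthropometric prediction equation and MRI-measured muscle mass.

```python
# Bland-Altman agreement analysis, as mentioned in the methods: mean difference (bias) and
# 95% limits of agreement between a prediction equation and the MRI reference. A generic
# sketch with made-up values, not the thesis data.
import statistics

def bland_altman(reference, predicted):
    """Return (bias, lower limit, upper limit) for predicted minus reference."""
    diffs = [p - r for r, p in zip(reference, predicted)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical muscle-mass values (kg): MRI reference vs. an anthropometric equation.
mri = [22.1, 30.4, 25.8, 28.0, 19.5, 33.2]
equation = [23.0, 29.5, 26.9, 27.1, 20.3, 34.0]
bias, lo, hi = bland_altman(mri, equation)
print(f"bias = {bias:+.2f} kg, 95% limits of agreement = [{lo:.2f}, {hi:.2f}] kg")
```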

Relevância:

100.00%

Publicador:

Resumo:

The world’s population is ageing rapidly. Ageing has an impact on all aspects of human life, including social, economic, cultural, and political. Understanding ageing is therefore an important issue for the 21st century. This chapter will consider the active ageing model. This model is based on optimising opportunities for health, participation, and security in order to enhance quality of life. There is a range of exciting options developing for personal health management, for and by the ageing population, that make use of computer technology, and these should support active ageing. Their use depends however on older people learning to use computer technology effectively. The ability to use such technology will allow them to access relevant health information, advice, and support independently from wherever they live. Such support should increase rapidly in the future. This chapter is a consideration of ageing and learning, ageing and use of computer technology, and personal health management using computers.

Relevância:

100.00%

Publicador:

Resumo:

Background: The proportion of older individuals in the driving population is predicted to increase in the next 50 years. This has important implications for driving safety, as abilities which are important for safe driving, such as vision (which accounts for the majority of the sensory input required for driving), processing ability and cognition, have been shown to decline with age. The current methods employed for screening older drivers upon re-licensure are also vision based. This study, which investigated social, behavioural and professional aspects involved with older drivers, aimed to determine: (i) if the current visual standards in place for testing upon re-licensure are effective in reducing the older driver fatality rate in Australia; (ii) if the recommended visual standards are actually implemented as part of the testing procedures by Australian optometrists; and (iii) if there are other non-standardised tests which may be better at predicting the on-road incident risk (including near misses and minor incidents) in older drivers than those tests recommended in the standards. Methods: For the first phase of the study, state-based age- and gender-stratified numbers of older driver fatalities for 2000-2003 were obtained from the Australian Transportation Safety Bureau database. Poisson regression analyses of fatality rates were considered by renewal frequency and jurisdiction (as separate models), adjusting for the possible confounding variables of age, gender and year. For the second phase, all practising optometrists in Australia were surveyed on the vision tests they conduct in consultations relating to driving and their knowledge of vision requirements for older drivers. Finally, for the third phase of the study, to investigate determinants of on-road incident risk, a stratified random sample of 600 Brisbane residents aged 60 years and older was selected and invited to participate using an introductory letter explaining the project requirements. In order to capture the number and type of road incidents which occurred for each participant over 12 months (including near misses and minor incidents), an important component of the prospective research study was the development and validation of a driving diary. The diary was a tool in which incidents could be logged at, or very close to, the time they occurred; thus, in comparison with relying on participant memory over time, recall bias of incident occurrence was minimised. Associations of all visual tests, cognition and scores obtained on non-standard functional tests with retrospective and prospective incident occurrence were investigated. Results: In the first phase, drivers aged 60-69 years had a 33% lower fatality risk (Rate Ratio [RR] = 0.75, 95% CI 0.32-1.77) in states with vision testing upon re-licensure compared with states with no vision testing upon re-licensure; however, because the CIs are wide, crossing 1.00, this result should be regarded with caution. However, overall fatality rates and fatality rates for those aged 70 years and older (RR = 1.17, CI 0.64-2.13) did not differ between states with and without license renewal procedures, indicating no apparent benefit in vision testing legislation.
For the second phase of the study, nearly all optometrists measured visual acuity (VA) as part of a vision assessment for re-licensing; however, 20% of optometrists did not perform any visual field (VF) testing and only 20% routinely performed automated VF on older drivers, despite the standards for licensing advocating automated VF as part of the vision standard. This demonstrates the need for more effective communication between the policy makers and those responsible for carrying out the standards. It may also indicate that the overall higher driver fatality rate in jurisdictions with vision-testing requirements results from the tests recommended by the standards being only partially conducted by optometrists. Hence a standardised protocol for the screening of older drivers for re-licensure across the nation must be established. The opinions of Australian optometrists with regard to the responsibility of reporting older drivers who fail to meet the licensing standards highlighted the conflict between maintaining patient confidentiality and upholding public safety. Mandatory reporting requirements for those drivers who fail to reach the standards necessary for driving would minimise potential conflict between the patient and their practitioner, and help maintain patient trust and goodwill. The final phase of the PhD program investigated the efficacy of vision, functional and cognitive tests to discriminate between at-risk and safe older drivers. Nearly 80% of the participants experienced an incident of some form over the prospective 12 months, with the total incident rate being 4.65 per 10,000 km. Sixty-three percent reported having a near miss and 28% had a minor incident. The results from the prospective diary study indicate that the current vision screening tests (VA and VF) used for re-licensure do not accurately predict older drivers who are at increased odds of having an on-road incident. However, the variation in visual measurements of the cohort was narrow, which also affected the results seen with the visual function questionnaires. Hence a larger cohort with greater variability should be considered for a future study. A slightly lower cognitive level (as measured with the Mini-Mental State Examination [MMSE]) did show an association with incident involvement, as did slower reaction time (RT); however, the Useful Field of View (UFOV) test provided the most compelling results of the study. Cut-off values for UFOV processing (>23.3 ms), divided attention (>113 ms), selective attention (>258 ms) and overall score (moderate/high/very high risk) were effective in determining older drivers at increased odds of having any on-road incident and of having minor incidents. Discussion: The results have shown that, for the 60-69 year age group, there is a potential benefit in testing vision upon licence renewal. However, overall fatality rates and fatality rates for those aged 70 years and older indicated no benefit in vision testing legislation, suggesting a need for the inclusion of screening tests which better predict on-road incidents. Although VA is routinely performed by Australian optometrists on older drivers renewing their licence, VF is not. Therefore there is a need for a protocol to be developed and administered which would result in standardised methods, conducted throughout the nation, for the screening of older drivers upon re-licensure. Communication between the community, policy makers and those conducting the protocol should be maximised.
By implementing a standardised screening protocol which incorporates a level of mandatory reporting by the practitioner, the ethical dilemma of breaching patient confidentiality would also be resolved. The tests which should be included in this screening protocol, however, cannot solely be ones which have been implemented in the past. In this investigation, RT, MMSE and UFOV were shown to be better determinants of on-road incidents in older drivers than VA and VF; however, as previously mentioned, there was a lack of variability in visual status within the cohort. Nevertheless, the recommendation from this investigation is that, subject to appropriate sensitivity and specificity being demonstrated in the future using a cohort with wider variation in vision, functional performance and cognition, these tests of cognition and information processing should be added to the current protocol for the screening of older drivers, which may be conducted at licensing centres across the nation.
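The UFOV cut-off values quoted above lend themselves to a simple screening rule. The sketch below applies those thresholds to hypothetical driver records to flag who would be classed at increased odds of an on-road incident; the record fields and example values are invented, and only the thresholds come from the abstract.

```python
# A small sketch applying the UFOV cut-off values reported above to flag drivers at
# increased odds of an on-road incident. The field names and example records are
# hypothetical; the thresholds are the ones quoted in the abstract.
def at_risk(ufov):
    """Return the list of cut-offs exceeded for one driver's UFOV results (times in ms)."""
    flags = []
    if ufov["processing_ms"] > 23.3:
        flags.append("processing")
    if ufov["divided_attention_ms"] > 113:
        flags.append("divided attention")
    if ufov["selective_attention_ms"] > 258:
        flags.append("selective attention")
    if ufov["overall_risk"] in {"moderate", "high", "very high"}:
        flags.append("overall risk category")
    return flags

drivers = [
    {"id": "D01", "processing_ms": 16.7, "divided_attention_ms": 100,
     "selective_attention_ms": 240, "overall_risk": "low"},
    {"id": "D02", "processing_ms": 30.0, "divided_attention_ms": 160,
     "selective_attention_ms": 300, "overall_risk": "high"},
]
for d in drivers:
    flags = at_risk(d)
    print(d["id"], "at increased odds:" if flags else "no cut-off exceeded", ", ".join(flags))
```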