993 results for Validation Tools
Abstract:
Presentation at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014
Abstract:
The Final Year Project consists of two essentially different parts that share a common theme: HTML code validation. The first part focuses on the study of the validation process. It provides a brief introduction to the evolution of HTML and XHTML, the new tags introduced in HTML5, and the most common errors found in today's websites. Existing HTML validation tools are analyzed and examined in detail in order to compare their features and evaluate their performance. Lastly, a comparison of the parsing process in the most common browsers available today is provided. The second part shifts the focus to the development of an XHTML5 validation tool. The input is an XHTML5 file whose content may or may not comply with the W3C specification and, therefore, may or may not be a valid XHTML5 document. The tool outputs a fixed XHTML5 document and an error log returned in the form of an XML file. Information about the corrective action taken for each error and its location is also included.
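As an illustration of such a fix-and-log workflow, the Python sketch below uses lxml's recovering parser to repair a malformed XHTML5 file and emit an XML error log. This is a minimal stand-in under that assumption, not the project's actual tool; the function name validate_and_fix is invented here.

```python
# A minimal sketch of a fix-and-log pipeline in the spirit of the tool
# described above, using lxml's recovering parser (an assumption: the
# actual project may implement its own parsing and repair strategies).
from lxml import etree

def validate_and_fix(path: str):
    parser = etree.XMLParser(recover=True)   # repairs what it can
    tree = etree.parse(path, parser)

    # Collect parser diagnostics into an XML error log.
    log = etree.Element("errorlog")
    for err in parser.error_log:
        entry = etree.SubElement(log, "error",
                                 line=str(err.line), column=str(err.column))
        entry.text = err.message

    fixed = etree.tostring(tree, pretty_print=True, encoding="unicode")
    return fixed, etree.tostring(log, pretty_print=True, encoding="unicode")
```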
Abstract:
Architectural decisions can be interpreted as structural and behavioral constraints that must be enforced in order to guarantee overarching qualities in a system. Enforcing those constraints in a fully automated way is often challenging and not well supported by current tools. Current approaches for checking architecture conformance either lack usability or offer poor options for adaptation. To overcome this problem, we analyze the current state of practice and propose an approach based on an extensible, declarative and empirically grounded specification language. This solution aims at reducing the overall cost of setting up and maintaining an architectural conformance monitoring environment by decoupling the conceptual representation of a user-defined rule from its technical specification prescribed by the underlying analysis tools. By using a declarative language, we are able to write tool-agnostic rules that are simple enough to be understood by untrained stakeholders and, at the same time, can be automatically processed by a conformance checking validator. Besides addressing the issue of cost, we also investigate opportunities for increasing the value of conformance checking results by assisting the user towards full alignment of the implementation with its architecture. In particular, we show the benefits of providing actionable results by introducing a technique which automatically selects the optimal repair solutions by means of simulation and profit-based quantification. We perform various case studies to show how our approach can be successfully adopted to support truly diverse industrial projects. We also investigate the dynamics involved in choosing and adopting a new automated conformance checking solution within an industrial context. Our approach reduces the cost of conformance checking by avoiding the need for explicit management of the involved validation tools. The user can define rules using a convenient high-level DSL which automatically adapts to emerging analysis requirements. Increased usability and modular customization ensure lower costs and a shorter feedback loop.
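To make the idea of tool-agnostic declarative rules concrete, here is a hypothetical Python sketch; the rule shapes (cannot-depend, can-only-depend) and module names are invented for illustration and are not the specification language proposed in this work.

```python
# Toy dependency graph of a system and two invented conformance rules.
deps = {"ui": {"service"}, "service": {"data"}, "data": {"ui"}}

rules = [
    {"kind": "cannot-depend", "from": "data", "to": "ui"},
    {"kind": "can-only-depend", "from": "ui", "to": {"service"}},
]

def check(rules, deps):
    """Evaluate declarative rules against the dependency graph."""
    violations = []
    for r in rules:
        if r["kind"] == "cannot-depend" and r["to"] in deps.get(r["from"], set()):
            violations.append(f'{r["from"]} must not depend on {r["to"]}')
        elif r["kind"] == "can-only-depend":
            extra = deps.get(r["from"], set()) - r["to"]
            if extra:
                violations.append(
                    f'{r["from"]} may only depend on {sorted(r["to"])}, '
                    f'found {sorted(extra)}')
    return violations

print(check(rules, deps))  # -> ['data must not depend on ui']
```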
Abstract:
In recent years, challenged by the climate scenarios put forward by the IPCC and its potential impact on plant distribution, numerous predictive techniques -including the so called habitat suitability models (HSM)- have been developed. Yet, as the output of the different methods produces different distribution areas, developing validation tools are strong needs to reduce uncertainties. Focused in the Iberian Peninsula, we propose a palaeo-based method to increase the robustness of the HSM, by developing an ecological approach to understand the mismatches between the palaeoecological information and the projections of the HSMs. Here, we present the result of (1) investigating causal relationships between environmental variables and presence of Pinus sylvestris L. and P. nigra Arn. available from the 3rd Spanish Forest Inventory, (2) developing present and past presence-predictions through the MaxEnt model for 6 and 21 kyr BP, and (3) assessing these models through comparisons with biomized palaeoecological data available from the European Pollen Database for the Iberian Peninsula.
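The assessment step (3) can be pictured as a gridded comparison between thresholded HSM output and pollen-derived presences. The Python sketch below is a minimal illustration under that assumption; the cell names and the 0.5 threshold are invented for the example.

```python
# A minimal sketch of the palaeo-based assessment step, assuming model
# predictions and pollen-derived presences are already gridded to the
# same cells (cell names and threshold are illustrative only).
def agreement(predicted_suitability, pollen_presence, threshold=0.5):
    """Compare thresholded HSM output against palaeoecological records."""
    tp = fp = fn = tn = 0
    for cell, suit in predicted_suitability.items():
        pred = suit >= threshold
        obs = pollen_presence.get(cell, False)
        tp += pred and obs
        fp += pred and not obs
        fn += (not pred) and obs
        tn += (not pred) and (not obs)
    sensitivity = tp / (tp + fn) if tp + fn else float("nan")
    specificity = tn / (tn + fp) if tn + fp else float("nan")
    return {"sensitivity": sensitivity, "specificity": specificity}

# Toy example: MaxEnt suitabilities at 21 kyr BP vs. biomized pollen data.
print(agreement({"cellA": 0.8, "cellB": 0.2}, {"cellA": True, "cellB": True}))
```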
Abstract:
Embedded systems are increasingly common and complex, so finding safe, efficient and inexpensive software development processes aimed specifically at this class of systems is more necessary than ever. Unlike the situation until recently, technological advances in microprocessors now allow the development of hardware with more than enough performance to run several software systems on a single machine. Moreover, there are embedded systems with safety requirements whose correct operation is critical to human lives and/or large economic investments. These software systems are designed and implemented according to very strict and demanding software development standards. In some cases, software certification may also be necessary. For these cases, mixed-criticality systems can be a very valuable alternative. In this class of systems, applications with different criticality levels run on the same computer. However, it is often necessary to certify the entire system at the criticality level of the most critical application, which drives costs up sharply. Virtualization has been put forward as a very promising technology for containing those costs. This technology allows a set of virtual machines, or partitions, to run applications with very high levels of both temporal and spatial isolation. This, in turn, allows each partition to be certified independently. Developing partitioned mixed-criticality systems requires updating traditional software development models, since these cover neither the new activities nor the new roles required in the development of such systems. For example, the system integrator must define the partitions, and the application developer must take into account the characteristics of the partition where the application will run. Traditionally, the V-model has been especially relevant in embedded systems development. It has therefore been adapted to cover scenarios such as the parallel development of applications or the incorporation of a new partition into an existing system. The objective of this PhD thesis is to improve the current technology for developing partitioned mixed-criticality systems. To this end, an environment specifically aimed at facilitating and improving the development processes of this class of systems has been designed and implemented. In particular, an algorithm that generates the system partitioning automatically has been created. All the activities required to develop a partitioned system, including the new roles and activities mentioned above, have been integrated into the proposed development environment. Furthermore, the design of the development environment is based on Model-Driven Engineering, which promotes the use of models as fundamental elements of the development process. The environment thus provides the tools needed to model and partition the system, as well as to validate the results and generate the artifacts required to compile, build and deploy it. In addition, extensibility and integration with validation tools have been key factors in the design of the development environment.
Specifically, new non-functional requirements can be incorporated into the development environment, as can the generation of new artifacts such as documentation or support for different programming languages. A key part of the development environment is the partitioning algorithm. This algorithm has been designed to be independent of the applications' requirements and to allow the system integrator to implement new system requirements. To achieve this independence, partitioning constraints have been defined. The algorithm guarantees that these constraints will be satisfied in the partitioned system that results from its execution. The partitioning constraints have been designed with sufficient expressive power so that most common non-functional requirements can be expressed with a small set of them. Constraints can be defined manually by the system integrator or generated automatically by a tool from an application's functional and non-functional requirements. The partitioning algorithm takes the system models and the partitioning constraints as inputs. As a result of its execution, a deployment model is generated, defining the partitions needed for the system. In turn, each partition defines which applications must run in it, as well as the resources the partition needs to run correctly. The partitioning problem and the partitioning constraints are modeled mathematically as colored graphs, in which a proper vertex coloring represents a correct system partitioning. The algorithm has also been designed so that, if necessary, alternative partitionings to the one initially proposed can be obtained. The development environment, including the partitioning algorithm, has been successfully tested on two industrial use cases: the UPMSat-2 satellite and a demonstrator of a wind turbine control system. Moreover, the algorithm has been validated by running numerous synthetic scenarios, including very complex ones with more than 500 applications. ABSTRACT The importance of embedded software is growing as it is required for a large number of systems. Devising cheap, efficient and reliable development processes for embedded systems is thus a notable challenge nowadays. Computer processing power is continuously increasing, and as a result, it is currently possible to integrate complex systems in a single processor, which was not feasible a few years ago. Embedded systems may have safety-critical requirements; their failure may result in loss of life or substantial economic loss. The development of these systems requires stringent development processes that are usually defined by suitable standards. In some cases their certification is also necessary. This scenario fosters the use of mixed-criticality systems in which applications of different criticality levels must coexist in a single system. In these cases, it is usually necessary to certify the whole system, including non-critical applications, which is costly. Virtualization emerges as an enabling technology for dealing with this problem. The system is structured as a set of partitions, or virtual machines, that can be executed with temporal and spatial isolation. In this way, applications can be developed and certified independently.
The development of MCPS (Mixed-Criticality Partitioned Systems) requires additional roles and activities that traditional systems do not. The system integrator has to define the system partitions, and application development has to consider the characteristics of the partition to which each application is allocated. In addition, traditional software process models have to be adapted to this scenario. The V-model is commonly used in embedded systems development; it can be adapted to the development of MCPS by enabling the parallel development of applications or the addition of a new partition to an existing system. The objective of this PhD is to improve the available technology for MCPS development by providing a framework tailored to the development of this type of system and by defining a flexible and efficient algorithm for automatically generating system partitionings. The goal of the framework is to integrate all the activities required for developing MCPS and to support the different roles involved in this process. The framework is based on MDE (Model-Driven Engineering), which emphasizes the use of models in the development process. The framework provides basic means for modeling the system, generating system partitions, validating the system and generating final artifacts. It has been designed to facilitate its extension and the integration of external validation tools. In particular, it can be extended by adding support for additional non-functional requirements and for final artifacts, such as new programming languages or additional documentation. The framework includes a novel partitioning algorithm, designed to be independent of the types of application requirements and to enable the system integrator to tailor the partitioning to the specific requirements of a system. This independence is achieved by defining partitioning constraints that must be met by the resulting partitioning. They have sufficient expressive capacity to state the most common constraints and can be defined manually by the system integrator or generated automatically from the functional and non-functional requirements of the applications. The partitioning algorithm uses system models and partitioning constraints as its inputs and generates a deployment model that is composed of a set of partitions. Each partition is in turn composed of a set of allocated applications and assigned resources. The partitioning problem, including applications and constraints, is modeled as a colored graph, in which a valid partitioning is a proper vertex coloring. A specially designed algorithm generates this coloring and is able to provide alternative partitionings if required. The framework, including the partitioning algorithm, has been successfully used in the development of two industrial use cases: the UPMSat-2 satellite and the control system of a wind-power turbine. The partitioning algorithm has been successfully validated using a large number of synthetic loads, including complex scenarios with more than 500 applications.
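The coloring formulation lends itself to a compact illustration. The Python sketch below shows a greedy proper vertex coloring in which applications are vertices, separation constraints are edges and colors are partitions; it is a toy stand-in, not the thesis' algorithm, which also handles resource assignment and alternative partitionings.

```python
# A minimal sketch of the graph-coloring view of partitioning described
# above (illustrative only; the real algorithm supports richer
# constraints and can enumerate alternative solutions).
def partition(apps, must_separate):
    """Greedy proper coloring: apps are vertices, 'must not share a
    partition' constraints are edges, colors are partitions."""
    neighbors = {a: set() for a in apps}
    for a, b in must_separate:
        neighbors[a].add(b)
        neighbors[b].add(a)

    color = {}
    for a in sorted(apps, key=lambda a: -len(neighbors[a])):  # hardest first
        used = {color[n] for n in neighbors[a] if n in color}
        color[a] = next(c for c in range(len(apps)) if c not in used)
    return color

# Toy system: the critical app may not share a partition with either
# non-critical app, so it is assigned its own partition (color 0).
print(partition(["critical", "logger", "hmi"],
                [("critical", "logger"), ("critical", "hmi")]))
```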
Abstract:
Software architecture erodes over time and must be constantly monitored to be kept consistent with its original intended design. Consistency is rarely monitored using automated techniques, as the cost associated with such an activity is typically not considered proportional to its benefits. To improve this situation, we propose Dicto, a uniform DSL for specifying architectural invariants. This language is designed to reduce the cost of consistency checking by offering a framework in which existing validation tools can be matched to newly defined language constructs. In this paper we discuss how such a DSL can be qualitatively and quantitatively evaluated in practice.
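One way to picture the matching of existing validation tools to language constructs is a registry of tool adapters keyed by rule kind. The Python sketch below is hypothetical; the adapter names and the rule shape are invented for illustration and do not reflect Dicto's actual syntax or architecture.

```python
# A hypothetical sketch of dispatching a parsed rule to whichever
# registered tool adapter can check it (everything here is invented
# for illustration; Dicto's real design differs).
class TodoAdapter:
    handles = "no-todo-comments"
    def check(self, rule, codebase):
        return [f for f, src in codebase.items() if "TODO" in src]

class CycleAdapter:
    handles = "no-dependency-cycles"
    def check(self, rule, codebase):
        return []  # placeholder: a real adapter would call an analysis tool

ADAPTERS = {a.handles: a() for a in (TodoAdapter, CycleAdapter)}

def run_rule(rule, codebase):
    adapter = ADAPTERS.get(rule["kind"])
    if adapter is None:
        raise ValueError(f"no validator matches rule {rule['kind']!r}")
    return adapter.check(rule, codebase)

print(run_rule({"kind": "no-todo-comments"},
               {"a.py": "x = 1  # TODO fix", "b.py": "y = 2"}))
```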
Abstract:
Faculty of Biology: Institute of Molecular Biology and Biotechnology (Wydział Biologii: Instytut Biologii Molekularnej i Biotechnologii)
Abstract:
Introduction: Instrument validation is essential in epidemiological research, especially for a consensual case definition and for the comparison of results. Currently, the most widely used instrument to identify functional dyspepsia is the Rome III questionnaire, which has not been validated for the Portuguese population. Objectives: To validate the Rome III questionnaire for functional dyspepsia in Portuguese adults. Methods: The questionnaire was translated following the Rome III recommendations. A total of 166 individuals answered the questionnaire. Identification of the functional dyspepsia category in adults was based on one or more of the 4 symptoms that the scale assesses through 6 items. Internal consistency, reproducibility and content analysis were evaluated using SPSS 23.0. Results: Cronbach's alpha coefficient for the full set of 18 items was 0.89. For the functional dyspepsia category (assessed through 6 items) it was 0.76, and Cronbach's alpha based on standardized items was 0.85. Conclusions: We validated, for Portugal, the Rome III questionnaire for the diagnosis of functional gastrointestinal disorders, namely for the functional dyspepsia category in adults. These results suggest that this instrument will be useful for research in the Portuguese population.
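For reference, Cronbach's alpha, the reliability coefficient reported above, can be computed as in the following sketch; the toy scores are invented and are not the study's data.

```python
# Cronbach's alpha for k questionnaire items: alpha = k/(k-1) *
# (1 - sum of item variances / variance of respondents' total scores).
def cronbach_alpha(items):
    """items: list of per-item score lists, one inner list per item."""
    k = len(items)
    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    totals = [sum(scores) for scores in zip(*items)]  # per-respondent totals
    return (k / (k - 1)) * (1 - sum(var(i) for i in items) / var(totals))

# Three items answered by four respondents (toy data).
print(round(cronbach_alpha([[3, 4, 2, 5], [2, 4, 3, 5], [3, 5, 2, 4]]), 2))
```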
Abstract:
BACKGROUND The distribution of the enzymopathy glucose-6-phosphate dehydrogenase (G6PD) deficiency is linked to areas of high malaria endemicity due to its association with protection from disease. G6PD deficiency is also identified as the cause of severe haemolysis following administration of the anti-malarial drug primaquine, and further use of this drug will likely require identification of G6PD deficiency at the population level. Current conventional methods for G6PD screening have various disadvantages for field use. METHODS The WST8/1-methoxy PMS method, recently adapted for field use, was validated against a gold-standard enzymatic assay (R&D Diagnostics Ltd®) in a study involving 235 children under five years of age, recruited by random selection from a cohort study in Tororo, Uganda. Blood spots were collected by finger-prick onto filter paper at routine visits, and G6PD activity was determined by both tests. Performance of the WST8/1-methoxy PMS test under various temperature, light and storage conditions was evaluated. RESULTS The WST8/1-methoxy PMS assay was found to have 72% sensitivity and 98% specificity when compared to the commercial enzymatic assay, and the AUC was 0.904, suggesting good agreement. Misclassifications occurred at borderline values of G6PD activity between mild and normal levels, or were related to outlier haemoglobin values (<8.0 gHb/dl or >14 gHb/dl) associated with ongoing anaemia or recent haemolytic crises. Although severe G6PD deficiency was not found in the area, the test enabled identification of low G6PD activity. The assay was found to be highly robust for field use, showing low sensitivity to light, good performance over a wide temperature range, and good capacity for medium-to-long-term storage. CONCLUSIONS The WST8/1-methoxy PMS assay was comparable to the currently used standard enzymatic test, and offers advantages in terms of cost, storage, portability and use in resource-limited settings. Such features make this test a potential key tool for deployment in the field for point-of-care assessment prior to primaquine administration in malaria-endemic areas. As with other G6PD tests, outlier haemoglobin levels may confound estimation of G6PD levels.
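The reported sensitivity and specificity come from a standard 2x2 comparison against the gold standard. The sketch below reproduces the arithmetic with hypothetical counts chosen only to match the reported rates; they are not the study's actual data.

```python
# Diagnostic accuracy from a 2x2 table of test results vs. gold standard.
def diagnostic_accuracy(tp, fp, fn, tn):
    return {"sensitivity": tp / (tp + fn), "specificity": tn / (tn + fp)}

# Invented example: 18 of 25 deficient samples detected, 206 of 210
# normal samples correctly classified.
print(diagnostic_accuracy(tp=18, fp=4, fn=7, tn=206))
# -> sensitivity 0.72, specificity ~0.98
```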
Abstract:
Nervous system disorders are associated with cognitive and motor deficits and are responsible for the highest disability rates and global burden of disease. Their recovery paths are vulnerable and depend on the effective combination of plastic brain tissue properties with complex, lengthy and expensive neurorehabilitation programs. This work explores two lines of research, envisioning sustainable solutions to improve the treatment of cognitive and motor deficits. Both projects were developed in parallel and shared a new, sensible approach in which low-cost technologies were integrated with common clinical operating procedures. The aim was to achieve more intensive treatment under specialized monitoring, improve clinical decision-making and increase access to healthcare. The first project (articles I–III) concerned the development and evaluation of a web-based cognitive training platform (COGWEB), suitable for intensive use, either at home or at institutions, and across a wide spectrum of ages and diseases that impair cognitive functioning. It was tested for usability in a memory clinic setting and implemented in a collaborative network comprising 41 centers and 60 professionals. An adherence and intensity study revealed a compliance of 82.8% at six months and an average of six hours/week of continued online cognitive training activities. The second project (articles IV–VI) was designed to create and validate an intelligent rehabilitation device that administers proprioceptive stimuli to the hemiparetic side of stroke patients while performing ambulatory movement characterization (SWORD). Targeted vibratory stimulation was found to be well tolerated, and an automatic motor characterization system retrieved results comparable to the first items of the Wolf Motor Function Test. The global system was tested in a randomized placebo-controlled trial to assess its impact on a common motor rehabilitation task in a relevant clinical environment (early post-stroke). The number of correct movements on a hand-to-mouth task increased by an average of 7.2/minute, while the probability of performing an error decreased from 1:3 to 1:9. Neurorehabilitation and neuroplasticity are shifting to more neuroscience-driven approaches. At the same time, their final utility for patients and society largely depends on the development of more effective technologies that facilitate the dissemination of the knowledge produced during the process. The results attained through this work represent a step forward in that direction. Their impact on the quality of rehabilitation services and on public health is discussed from clinical, technological and organizational perspectives. This process of thinking and oriented speculation has led to the debate of subsequent hypotheses, already being explored in novel research paths.
Abstract:
The purpose of this study was to implement research methodologies and assess the effectiveness and impact of management tools to promote best practices for the long-term conservation of the endangered African wild dog (Lycaon pictus). Different methods were included in the project framework to investigate and expand the applicability of these methodologies to free-ranging African wild dogs in the southern African region: ethology, behavioural endocrinology and ecology field methodologies were tested and implemented. Additionally, research was performed to test the effectiveness and implications of a contraceptive implant (Suprelorin) as a management tool for a subpopulation of the species hosted in fenced areas. Particular attention was given to the social structure and survival of treated packs. This research provides useful tools and advances the applicability of these methods to field studies, standardizing and improving research instruments in the fields of conservation biology and behavioural endocrinology. The results reported here provide effective methodologies to expand the applicability of non-invasive endocrine assessment to previously inaccessible contexts, together with a validation of sampling methods for faecal hormone analysis. The final aim was to fill a knowledge gap on the behaviour of the species and to provide common ground for future researchers to apply non-invasive methods to research on this species and to test the effectiveness of contraception in a managed metapopulation.
Abstract:
High-throughput screening of physical, genetic and chemical-genetic interactions brings important perspectives to the Systems Biology field, as the analysis of these interactions provides new insights into protein/gene function, cellular metabolic variations and the validation of therapeutic targets and drug design. However, such analysis depends on a pipeline connecting different tools that can automatically integrate data from diverse sources and produce a more comprehensive dataset that can be properly interpreted. We describe here the Integrated Interactome System (IIS), an integrative platform with a web-based interface for the annotation, analysis and visualization of the interaction profiles of proteins/genes, metabolites and drugs of interest. IIS works in four connected modules: (i) the Submission module, which receives raw data derived from Sanger sequencing (e.g. two-hybrid system); (ii) the Search module, which enables the user to search for the processed reads to be assembled into contigs/singlets, or for lists of proteins/genes, metabolites and drugs of interest, and add them to the project; (iii) the Annotation module, which assigns annotations from several databases to the contigs/singlets or lists of proteins/genes, generating tables with automatic annotation that can be manually curated; and (iv) the Interactome module, which maps the contigs/singlets or the uploaded lists to entries in our integrated database, building networks that gather newly identified interactions, protein and metabolite expression/concentration levels, subcellular localization, computed topological metrics, and GO biological process and KEGG pathway enrichment. This module generates an XGMML file that can be imported into Cytoscape or visualized directly on the web. We developed IIS by integrating diverse databases, following the need for appropriate tools for a systematic analysis of physical, genetic and chemical-genetic interactions. IIS was validated with yeast two-hybrid, proteomics and metabolomics datasets, but it is also extendable to other datasets. IIS is freely available online at: http://www.lge.ibi.unicamp.br/lnbio/IIS/.
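As a flavor of the Interactome module's export format, the following Python sketch writes a toy network in XGMML, the Cytoscape-readable format mentioned above; the node labels are illustrative only, and IIS itself emits much richer annotation.

```python
# A minimal sketch of exporting a toy interaction network to XGMML
# (standard XGMML namespace; attributes kept deliberately sparse).
from xml.etree import ElementTree as ET

def to_xgmml(nodes, edges, name="toy-network"):
    ns = "http://www.cs.rpi.edu/XGMML"
    graph = ET.Element("graph", {"label": name, "xmlns": ns})
    idx = {n: str(i) for i, n in enumerate(nodes)}
    for n in nodes:
        ET.SubElement(graph, "node", {"id": idx[n], "label": n})
    for a, b in edges:
        ET.SubElement(graph, "edge", {"source": idx[a], "target": idx[b]})
    return ET.tostring(graph, encoding="unicode")

print(to_xgmml(["geneA", "geneB", "metab1"],
               [("geneA", "geneB"), ("geneB", "metab1")]))
```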
Abstract:
Appropriate pain assessment is very important for managing chronic pain. Given the cultural differences in verbally expressing pain and in psychosocial problems, specific tools are needed. The goal of this study was to identify and validate Brazilian pain descriptors. A purposive sample of health professionals and chronic pain patients was recruited. Four studies were conducted using direct and indirect psychophysical methods: category estimation, magnitude estimation, and magnitude estimation with line-length. Results showed the descriptors which best describe chronic pain in Brazilian culture, demonstrated that there is no significant correlation between patients and health professionals, and showed that the psychophysical scale of judgment of pain descriptors is valid, stable and consistent. The results reinforced that translations of word descriptors and research tools into another language may be inappropriate, owing to differences in perception and communication and the inadequacy of exact translations to reflect the intended meaning. Given the complexity of chronic pain, the personal suffering involved, and the need for accurate assessment of chronic pain using descriptors stemming from Brazilian culture and language, it is essential to investigate the most adequate words to describe chronic pain. Although it requires more refinement, the set of Brazilian chronic pain descriptors can be used to further develop a multidimensional pain assessment tool that is culturally sensitive. © 2009 by the American Society for Pain Management Nursing
Abstract:
The activity of validating identified requirements for an information system helps to improve the quality of a requirements specification document and, consequently, the success of a project. Although various support tools for requirements engineering exist on the market, there is still a lack of automated support for the validation activity. In this context, the purpose of this paper is to make up for that deficiency by using an automated tool to provide the resources needed to execute an adequate validation activity. The contribution of this study is to enable agile and effective follow-up of the scope established for the requirements, so as to lead the development toward a solution that satisfies the real needs of the users, as well as to supply project managers with relevant information about the maturity of the analysts involved in requirements specification.