942 results for software, translation, validation tool, VMNET, Wikipedia, XML
Resumo:
This paper presents the experimental three-year learning activity developed by a group of teachers in a wind tunnel facility. The authors, leading a team of students, carried out a project consisting of the design, assembly and testing of a wind tunnel. The project covered all stages of the process, from the initial specifications to the final flow quality assessments, including the calculation of each element and the construction of the complete wind tunnel. The group of final-year students was responsible for the whole wind tunnel project as part of their bachelor's degree project. The paper focuses on the development of the wind tunnel data acquisition software. This tool is essential for automating data acquisition from the wind tunnel facility systems, in particular from a 6DOF multi-axis force/torque sensor. This work can be considered a typical example of real engineering practice: a set of specifications that has to be modified because of the constraints imposed throughout the project in order to obtain the final result.
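To make the data-acquisition step concrete, the following Python sketch shows how raw voltages from a 6-axis force/torque sensor could be converted into forces and torques through a calibration matrix and averaged before logging. The calibration values, channel layout and function names are hypothetical; the abstract does not describe the actual implementation.

```python
# Hypothetical sketch of 6DOF force/torque acquisition; not the authors' software.
import numpy as np

CAL_MATRIX = np.eye(6)  # placeholder 6x6 calibration matrix (real values come from sensor calibration)

def read_wrench(raw_voltages: np.ndarray) -> dict:
    """Map six raw channel voltages to [Fx, Fy, Fz, Mx, My, Mz] via a linear calibration model."""
    fx, fy, fz, mx, my, mz = CAL_MATRIX @ raw_voltages
    return {"Fx": fx, "Fy": fy, "Fz": fz, "Mx": mx, "My": my, "Mz": mz}

def averaged_wrench(samples: list) -> dict:
    """Average repeated readings to reduce noise before converting and logging."""
    return read_wrench(np.mean(samples, axis=0))

if __name__ == "__main__":
    # Hypothetical noisy readings from the six analog channels.
    fake_samples = [np.random.normal(0.0, 0.01, 6) for _ in range(100)]
    print(averaged_wrench(fake_samples))
```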
Resumo:
The verification and validation activity plays a fundamental role in improving software quality. Determining which are the most effective techniques for carrying out this activity has been an aspiration of experimental software engineering researchers for years. This paper reports a controlled experiment evaluating the effectiveness of two unit testing techniques: the functional testing technique known as equivalence partitioning (EP) and the control-flow structural testing technique known as branch testing (BT). This experiment is a literal replication of Juristo et al. (2013). Both experiments serve the purpose of determining whether the effectiveness of BT and EP varies depending on whether or not the faults are visible for the technique (InScope or OutScope, respectively). We have used the materials, design and procedures of the original experiment, but in order to adapt the experiment to the context we have: (1) reduced the number of studied techniques from 3 to 2; (2) assigned subjects to experimental groups by means of stratified randomization to balance the influence of programming experience; (3) localized the experimental materials; and (4) adapted the training duration. We ran the replication at the Escuela Politécnica del Ejército Sede Latacunga (ESPEL) as part of a software verification & validation course. The experimental subjects were 23 master's degree students. EP is more effective than BT at detecting InScope faults. The session/program and group variables are found to have significant effects. BT is more effective than EP at detecting OutScope faults. The session/program and group variables have no effect in this case. The results of the replication and the original experiment are similar with respect to testing techniques. There are some inconsistencies with respect to the group factor; they can be explained by small sample effects. The results for the session/program factor are inconsistent for InScope faults. We believe that these differences are due to a combination of the fatigue effect and a technique × program interaction. Although we were able to reproduce the main effects, the changes to the design of the original experiment make it impossible to identify the causes of the discrepancies with certainty. We believe that further replications closely resembling the original experiment should be conducted to improve our understanding of the phenomena under study.
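As a hedged illustration of the two techniques compared in the experiment (not the experimental material itself), the sketch below applies equivalence partitioning and branch testing to the same small function: EP derives one test case per equivalence class of the specification, while BT derives inputs that exercise every branch of the code.

```python
# Illustrative only; the function, classes and test inputs are invented for this example.
def classify_fare(age: int) -> str:
    # Intended specification: children (<13) and seniors (>=65) pay a reduced fare.
    if age < 13:
        return "reduced"
    if age >= 65:
        return "reduced"
    return "full"

# EP: one representative input per equivalence class of the *specification*.
ep_cases = {5: "reduced", 30: "full", 70: "reduced"}

# BT: inputs chosen from the *code* so that every branch is exercised.
bt_cases = {10: "reduced", 40: "full", 66: "reduced"}

def failing_inputs(cases: dict) -> list:
    """Return the inputs whose observed output differs from the expected output."""
    return [x for x, expected in cases.items() if classify_fare(x) != expected]

if __name__ == "__main__":
    # A fault in an existing branch is visible (InScope) to both techniques here;
    # a fault consisting of a *missing* branch would be OutScope for BT, since BT
    # derives its cases from the code, but could still be caught by EP.
    print("EP failures:", failing_inputs(ep_cases))
    print("BT failures:", failing_inputs(bt_cases))
```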
Resumo:
Biomedical engineering emerged in the 1950s as a fascinating interdisciplinary blend in which engineering, biology and medicine pooled efforts to analyze and understand different diseases. The signals existing in this area must be analyzed and interpreted beyond the limited capabilities of the naked eye and human experience. This is where digital signal processing becomes an indispensable tool for extracting the relevant information hidden in these signals. Electrocardiography was one of the first areas in which digital signal processing was applied, more than 50 years ago. Electrocardiographic signals remain, to this day, an object of study for cardiologists and engineers. In this area, signal processing techniques have helped uncover information hidden from the naked eye that has changed the way certain previously diagnosed diseases are treated. Since then, numerous techniques for processing electrocardiographic signals have been developed; they can be summarized into three broad categories: time-frequency analysis, analysis of spatio-temporal organization, and separation of atrial activity from noise and interference. This project falls within the first category, time-frequency analysis, and specifically within what is known as dominant frequency analysis, which is applied here to the analysis of atrial fibrillation signals. The project includes a theoretical part, concerning the analysis and development of signal processing algorithms, and a practical part of programming and simulation with Matlab. Matlab is one of the fundamental tools for digital signal processing by computer, offering important functions and utilities for the development of projects in this field, which is why it was chosen as the tool for implementing the project.
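As an illustration of dominant frequency analysis (a Python sketch, not the project's Matlab code), the snippet below estimates the dominant frequency of a simulated atrial signal from its Welch periodogram; the 3-12 Hz search band and the signal parameters are assumptions made for the example.

```python
# Minimal dominant-frequency (DF) estimation sketch with invented test data.
import numpy as np
from scipy.signal import welch

def dominant_frequency(atrial_signal: np.ndarray, fs: float,
                       band: tuple = (3.0, 12.0)) -> float:
    """Return the frequency (Hz) of the largest spectral peak inside `band`."""
    freqs, psd = welch(atrial_signal, fs=fs, nperseg=min(len(atrial_signal), 4096))
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return float(freqs[mask][np.argmax(psd[mask])])

if __name__ == "__main__":
    fs = 1000.0
    t = np.arange(0, 10, 1 / fs)
    # Simulated "atrial" signal: a 6 Hz oscillation plus noise.
    sim = np.sin(2 * np.pi * 6.0 * t) + 0.3 * np.random.randn(t.size)
    print(dominant_frequency(sim, fs))  # expected to be close to 6 Hz
```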
Resumo:
This work deals with energy saving in buildings and urban planning. The premises in this case are a very demanding European and national regulatory context, moving decisively towards ever more efficient and economical buildings. The study focuses on the initial decisions taken about the occupation conditions of the urban plot, the most suitable building typologies, their morphology and scale, and the consequences these have for the final energy performance, both in objective and in regulatory terms. The aim is to quantify what these decisions represent in terms of energy savings. The whole analysis is carried out for a specific climatic context, that of the city of Madrid. For the analysis of the different implantation conditions under study, a particular set of software tools has been used: the building energy demand assessment and energy certification programs that the Spanish government makes freely available to users. These are applications designed for the scale of a building and/or part of it, but with the methodology and simplifications detailed in this work they can be applied at the intermediate scale of urban intervention, both for new developments and for refurbishment. It should be borne in mind that these are the applications that will be used in most cases as the instrument for evaluating and rating the energy performance of each of the units. The typologies under study are:
- single-family house: detached, semi-detached and terraced row,
- open block,
- H-shaped block,
- cross-shaped block,
- tower,
- closed (perimeter) block.
The main content of the work focuses on the individual analysis of each typology and of its theoretical grouping on what could be called an "urban unit", a standard block of 10,000 m2 (1 ha). This unit is chosen because it is an urban area large enough to characterize the grouping of the different typologies studied and because it fits the capabilities of the software tools used. Different typological occupation options have been analyzed, keeping the following parameters constant in all the solutions studied:
• the climate (Madrid),
• the buildable floor area (in all cases but one, in which the model cannot reach the reference floor area),
• the formal purity of the model, avoiding compositional setbacks and projections of the envelope that would distort the behavior of the primary volume,
• the construction solutions of the envelope and interior partitions of the buildings,
• the proportion of openings in the envelope,
• the window frame and glazing solutions,
• and all the operational conditions applied by the simulation program.
This thesis is an analytical, evaluated study of the behavior of each of the types, their form and their position in space. Each of the models is simulated individually and in groups, in order to fill the reference buildable floor area on the 1 ha urban plot. All the results are studied independently and are presented in different tables and summarized in individual data sheets per type.
The main conclusion of the work is that the typology chosen as the urban residential container is, when well chosen, the first energy-saving measure, with quantifiable savings and emission reductions of more than 50% between the most favorable and the most unfavorable typologies. A second part of the research consists of applying this simulation methodology, using the same tools, to the study of real cases in the Community of Madrid (mainly in the city of Madrid). The aim is to validate the procedure and these tools also for the evaluation of consolidated urban fabric. As a singular case study of urban renewal, rehabilitation interventions based on acoustic criteria are analyzed, together with the opportunities that the inclusion of thermal criteria would offer, exploiting the synergy between the two demands, acoustic and thermal comfort.
Resumo:
This paper presents an online C compiler designed so that students can program their practical assignments in Programming courses. What is really innovative is the self-assessment of the exercises based on black-box tests, which also trains students' skills in testing software. Moreover, this tool lets instructors not only propose and classify practical exercises, but also automatically evaluate the effort expended and the results obtained by the students. The system has been applied to the first-year students of the Industrial Engineering specialization at the Universidad Politécnica de Madrid. Results show that the students obtained better academic performance, reducing the failure rate in the practical exam considerably with respect to previous years; in addition, an anonymous survey showed that students are satisfied with the system because they receive instant feedback on their programs.
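A minimal sketch of the kind of black-box self-assessment such a tool could perform is shown below: it compiles a C submission and compares its output against expected outputs for a set of test cases. The file names, test data and grading policy are hypothetical, not those of the actual system.

```python
# Hypothetical black-box grader sketch; a real system would sandbox and handle timeouts.
import pathlib
import subprocess
import tempfile

TESTS = [("3 4\n", "7\n"), ("10 -2\n", "8\n")]  # (stdin, expected stdout) pairs

def grade(c_source: str) -> float:
    """Compile a C submission and return the fraction of black-box tests passed."""
    workdir = pathlib.Path(tempfile.mkdtemp())
    src, exe = workdir / "submission.c", workdir / "submission"
    src.write_text(c_source)
    build = subprocess.run(["gcc", str(src), "-o", str(exe)], capture_output=True)
    if build.returncode != 0:
        return 0.0  # compilation error: no tests can pass
    passed = 0
    for stdin_data, expected in TESTS:
        run = subprocess.run([str(exe)], input=stdin_data, capture_output=True,
                             text=True, timeout=5)
        passed += run.stdout == expected  # exact-match comparison against expected output
    return passed / len(TESTS)
```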
Resumo:
The proposed work is to assemble a functional exploratory project based on cognitive cartography, defined as the art, theory and technique of building knowledge maps, with a view to applying this cartography to the structuring of a set of courses and bodies of knowledge in the field of Communication. Similar to the concept of conceptual, or cognitive, maps, these maps represent organized knowledge and are composed of concepts; they have been developed since the 1970s by several researchers. The experimental thesis was assembled on the basis of relationship software organized by themes and interests within a three-dimensional interactive environment, built on the concept of a relational knowledge tree. The experiment is constructed in a three-dimensional environment using 3D software that runs on video game engines, which are graphics engines. The database and the interactivity of texts and tasks run on the MediaWiki platform, the open-source software that runs Wikipedia. The map platform runs on MindJet MindManager and CMAPS. Video conferences are managed through FlashMeeting, a web conferencing tool. Most of these are open-source software packages, all operating in face-to-face or distance-learning (EAD) settings. The conceptual basis is structured within a vision of disruptive education, which proposes a new, map-based educational model, seen within the approach of a world of multiple screens, a hyper-era, hypermodern world grounded in the culture of the technological age, in a renewal of the concepts of Culture, now reinvigorated in the light of new technologies and the new networked society.
Resumo:
This work considered the micro-mechanical behavior of a long fiber embedded in an infinite matrix. Using the theory of elasticity, the idea of a boundary layer and some simplifying assumptions, an approximate analytical solution was obtained for the normal and shear stresses along the fiber. The analytical solution was found for the case in which the length of the embedded fiber is much greater than its radius and the Young's modulus of the matrix is much less than that of the fiber. The analytical solution was then compared with a numerical solution based on finite element analysis (FEA) using ANSYS. The numerical results showed the same qualitative behavior as the analytical solution, serving as a validation tool in the absence of experimental results. In general, this work provides a simple method to determine the thermal stresses along a fiber embedded in a matrix, which is the foundation for a better understanding of the interaction between fiber and matrix in the classical thermal-stress problem.
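For orientation only, the sketch below evaluates a classical shear-lag-type expression for the axial thermal stress along a fiber, which has the same qualitative shape as the behavior described above; the closed-form expression and the numerical parameters are assumptions, not the authors' analytical solution.

```python
# Classical shear-lag-style illustration; NOT the paper's solution, parameters invented.
import numpy as np

def fiber_axial_stress(x, half_length, beta, E_f, d_alpha, d_T):
    """sigma_f(x) = E_f * d_alpha * d_T * (1 - cosh(beta*x)/cosh(beta*half_length)),
    with x measured from the fiber mid-point: zero at the ends, maximal at the center."""
    return E_f * d_alpha * d_T * (1.0 - np.cosh(beta * x) / np.cosh(beta * half_length))

if __name__ == "__main__":
    x = np.linspace(-1.0, 1.0, 5)  # normalized positions along the fiber
    print(fiber_axial_stress(x, half_length=1.0, beta=5.0,
                             E_f=70e9, d_alpha=1e-6, d_T=100.0))
```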
Resumo:
Background/significance. The scarcity of reliable and valid Spanish-language instruments for health-related research has hindered research with the Hispanic population. Research suggests that fatalistic attitudes are related to poor cancer screening behaviors and may be one reason for the low participation of Mexican-Americans in cancer screening. This problem is of major concern because Mexican-Americans constitute the largest Hispanic subgroup in the U.S. Purpose. The purposes of this study were: (1) to translate the Powe Fatalism Inventory (PFI) into Spanish and culturally adapt the instrument to the Mexican-American culture as found along the U.S.-Mexico border, and (2) to test the equivalence between the Spanish-translated, culturally adapted version of the PFI and the English version of the PFI, including clarity, content validity, reading level and reliability. Design. Descriptive, cross-sectional. Methods. The Spanish-language translation used a translation model that incorporates a cultural adaptation process. The SPFI was administered to 175 bilingual participants residing in a midsize U.S.-Mexico border city. Data analysis included estimation of Cronbach's alpha, factor analysis, paired-samples t-test comparison and multiple regression analysis using SPSS software, as well as measurement of the content validity and reading level of the SPFI. Findings. The reliability estimate using Cronbach's alpha coefficient was 0.81 for the SPFI compared to 0.80 for the PFI in this study. Factor analysis extracted four factors which explained 59% of the variance. Paired t-test comparison revealed no statistically significant differences between the SPFI and PFI total or individual item scores. The content validity index was determined to be 1.0, and the reading level was assessed to be below a 6th-grade level. The correlation coefficient between the SPFI and PFI was 0.95. Conclusions. This study provided strong psychometric evidence that the Spanish-translated, culturally adapted SPFI is equivalent to the English version of the PFI in measuring cancer fatalism. This indicates that the two forms of the instrument can be used interchangeably in a single study to accommodate the reading and speaking abilities of respondents.
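The following Python sketch reproduces the core reliability computation reported (Cronbach's alpha) and a paired-samples t-test on total scores. The study used SPSS; this is an equivalent illustration with made-up data, and the score matrices are hypothetical.

```python
# Illustrative psychometric computations with simulated data (not the study's dataset).
import numpy as np
from scipy import stats

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix of scores."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars / total_var)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    trait = rng.normal(0, 1, size=(175, 1))                 # latent fatalism score
    spfi = trait + rng.normal(0, 1, size=(175, 15))          # hypothetical Spanish-form items
    pfi = spfi + rng.normal(0, 0.3, spfi.shape)              # hypothetical English-form items
    print("Cronbach's alpha (SPFI):", round(cronbach_alpha(spfi), 2))
    t, p = stats.ttest_rel(spfi.sum(axis=1), pfi.sum(axis=1))  # paired t-test on total scores
    print("paired t-test p-value:", round(p, 3))
```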
Resumo:
Universidade Estadual de Campinas. Faculdade de Educação Física
Resumo:
The activity of validating identified requirements for an information system helps to improve the quality of the requirements specification document and, consequently, the success of the project. Although various support tools for requirements engineering exist on the market, there is still a lack of automated support for the validation activity. In this context, the purpose of this paper is to make up for that deficiency by using an automated tool that provides the resources needed to carry out an adequate validation activity. The contribution of this study is to enable agile and effective follow-up of the scope established for the requirements, so as to lead the development towards a solution that satisfies the real needs of the users, as well as to supply project managers with relevant information about the maturity of the analysts involved in requirements specification.
Resumo:
The Douleur Neuropathique 4 (DN4) questionnaire was developed by the French Neuropathic Pain Group and is a simple and objective tool able to distinguish nociceptive from neuropathic pain. The purpose of this work was to validate the DN4 questionnaire in the Portuguese language in order to allow its use in clinical and research settings. A double-blind accuracy study was conducted, consisting of translation, back-translation, literal evaluation, semantic equivalence and communication with the target population. The Portuguese version of the questionnaire was applied to a sample of 101 patients with neuropathic (N = 42) or nociceptive pain (N = 59), classified according to medical diagnosis. The reproducibility, reliability and validity of the instrument were analyzed and showed a high diagnostic power for this version of the DN4 questionnaire. The Portuguese version of the DN4 questionnaire presented good validity and reliability, allowing it to identify neuropathic pain and the neuropathic characteristics of mixed pain syndromes. Perspective: This article presents the first validated neuropathic pain questionnaire in the Portuguese language and represents a useful tool for the assessment of neuropathic pain both in the clinical setting and in population-based studies. The quick and easy-to-apply format of this instrument is a key factor that will contribute to its widespread use, permitting true recognition of patients with neuropathic pain. (C) 2010 by the American Pain Society
Resumo:
Measuring the height of the vertical jump is an indicator of the strength and power of the lower body. The technological tools available to measure the vertical jump are black boxes and are not open to third-party verification or adaptation. We propose the creation of a measurement system called Chronojump-Boscosystem, consisting of open hardware and free software. Methods: A microcontroller was created and validated using a square wave generator and an oscilloscope. Two types of contact platforms were developed using different materials. These platforms were validated by measuring, with a strain gauge, the minimum pressure required for activation at different points, and by comparing the on/off times of our platforms against the Ergojump-Boscosystem platform on a sample of 8 subjects performing submaximal jumps with one foot on each platform. Agile methodologies were used to develop and validate the software. Results: All the tools fall under the free software / open hardware guidelines and are, in that sense, free. The microcontroller's margin of error is 0.1%. The validity of the fiberglass platform is 0.95 (ICC). The management software contains nearly 113,000 lines of code and is available in 7 languages.
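Although the abstract does not spell out the measurement model, contact-platform systems of this kind conventionally estimate jump height from the measured flight time as h = g·t²/8; the sketch below shows that calculation under that assumption.

```python
# Flight-time method sketch; assumed convention, not quoted from the paper.
G = 9.81  # gravitational acceleration, m/s^2

def jump_height_from_flight_time(t_flight_s: float) -> float:
    """Vertical jump height (m) estimated from the flight time (s) measured by the platform."""
    return G * t_flight_s ** 2 / 8.0

if __name__ == "__main__":
    # A flight time of 0.50 s corresponds to roughly a 0.31 m jump.
    print(round(jump_height_from_flight_time(0.50), 3))
```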
Resumo:
A web service is a software system that provides a machine-processable interface to other machines over the network using different Internet protocols. Web services are being increasingly used in industry in order to automate different tasks and offer services to a wider audience. The REST architectural style aims at producing scalable and extensible web services using technologies that play well with the existing tools and infrastructure of the web. It provides a uniform set of operations that can be used to invoke a CRUD interface (create, retrieve, update and delete) of a web service. The stateless behavior of the service interface requires that every request to a resource is independent of the previous ones, which facilitates scalability. Automated systems, e.g., hotel reservation systems, provide advanced scenarios for stateful services that require a certain sequence of requests to be followed in order to fulfill the service goals. Designing and developing such services for advanced scenarios under REST constraints requires rigorous approaches capable of creating web services that can be trusted for their behavior. Systems that can be trusted for their behavior can be termed dependable systems. This thesis presents an integrated design, analysis and validation approach that helps the service developer to create dependable and stateful REST web services. The main contribution of this thesis is a novel model-driven methodology to design behavioral REST web service interfaces and their compositions. The behavioral interfaces provide information on what methods can be invoked on a service and the pre- and post-conditions of these methods. The methodology uses the Unified Modeling Language (UML) as the modeling language, which has a wide user base and mature tools that are continuously evolving. We have used the UML class diagram and the UML state machine diagram, with additional design constraints, to provide resource and behavioral models, respectively, for designing REST web service interfaces. These service design models serve as a specification document, and the information presented in them has manifold applications. The service design models also contain information about the time and domain requirements of the service, which supports requirement traceability, an important part of our approach. Requirement traceability helps in capturing faults in the design models and in other elements of the software development environment by tracing the unfulfilled requirements of the service back and forth. Information about service actors is also included in the design models; it is required for authenticating service requests by authorized actors, since not all types of users have access to all resources. In addition, following our design approach, the service developer can ensure that the designed web service interfaces are REST compliant. The second contribution of this thesis is the consistency analysis of behavioral REST interfaces. To overcome the inconsistency problem and design errors in our service models, we have used semantic technologies. The REST interfaces are represented in the Web Ontology Language, OWL 2, so that they can be part of the semantic web. These interfaces are used with OWL 2 reasoners to check for unsatisfiable concepts, which result in implementations that fail. This work is fully automated thanks to the implemented translation tool and the existing OWL 2 reasoners.
The third contribution of this thesis is the verification and validation of REST web services. We have used model checking techniques with the UPPAAL model checker for this purpose. Timed automata are generated from the UML-based service design models with our transformation tool and are verified for basic properties such as deadlock freedom, liveness, reachability and safety. The implementation of a web service is tested using a black-box testing approach. Test cases are generated from the UPPAAL timed automata and, using the online testing tool UPPAAL TRON, the service implementation is validated at runtime against its specifications. Requirement traceability is also addressed in our validation approach, with which we can see which service goals are met and trace unfulfilled service goals back to faults in the design models. A final contribution of the thesis is an implementation of behavioral REST interfaces and service monitors from the service design models. The partial code generation tool creates code skeletons of REST web services with method pre- and post-conditions. The preconditions of methods constrain the user to invoke the stateful REST service under the right conditions, and the postconditions constrain the service developer to implement the right functionality. The details of the methods can be manually inserted by the developer as required. We do not target complete automation because we focus only on the interface aspects of the web service. The applicability of the approach is demonstrated with a pedagogical example of a hotel room booking service and a relatively complex worked example of a holiday booking service taken from an industrial context. The former example presents a simple explanation of the approach, and the latter worked example shows how stateful and timed web services offering complex scenarios and involving other web services can be constructed using our approach.
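As a hedged illustration of the behavioral interfaces described above (hypothetical code, not the thesis' generated skeletons), the sketch below shows a stateful booking resource whose methods check pre- and post-conditions so that requests are only accepted in the right states.

```python
# Hypothetical stateful resource with pre/post-condition checks, in the spirit of
# behavioral REST interfaces; class and state names are invented for this example.
class BookingResource:
    STATES = ("available", "reserved", "paid", "cancelled")

    def __init__(self) -> None:
        self.state = "available"

    def reserve(self) -> None:
        assert self.state == "available", "precondition: room must be available"
        self.state = "reserved"
        assert self.state == "reserved", "postcondition: room is now reserved"

    def pay(self) -> None:
        assert self.state == "reserved", "precondition: only a reserved room can be paid"
        self.state = "paid"

    def cancel(self) -> None:
        assert self.state in ("reserved", "paid"), "precondition: nothing to cancel"
        self.state = "cancelled"

if __name__ == "__main__":
    booking = BookingResource()
    booking.reserve()   # allowed: available -> reserved
    booking.pay()       # allowed: reserved -> paid
    # Calling booking.reserve() here would violate the precondition and raise AssertionError.
```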
Resumo:
Embedded systems are usually designed for a single or a specified set of tasks. This specificity means the system design as well as its hardware/software development can be highly optimized. Embedded software must meet requirements such as highly reliable operation on resource-constrained platforms, real-time constraints and rapid development. This necessitates the adoption of static machine-code analysis tools running on a host machine for the validation and optimization of embedded system code, which can help meet all of these goals. This could significantly improve software quality and is still a challenging field. This dissertation contributes an architecture-oriented code validation, error localization and optimization technique that assists the embedded system designer in software debugging, making it more effective at the early detection of software bugs that are otherwise hard to detect, using static analysis of machine code. The focus of this work is to develop methods that automatically localize faults and optimize the code, thereby improving both the debugging process and the quality of the code. Validation is done with the help of rules of inference formulated for the target processor. The rules govern the occurrence of illegitimate or out-of-place instructions and code sequences for executing the computational and integrated peripheral functions. The stipulated rules are encoded in propositional logic formulae and their compliance is tested individually in all possible execution paths of the application programs.
An incorrect sequence of machine-code patterns is identified using slicing techniques on the control flow graph generated from the machine code. An algorithm is proposed to assist the compiler in eliminating redundant bank-switching code and in deciding on the optimum data allocation to banked memory, resulting in the minimum number of bank-switching instructions in embedded system software. A relation matrix and a state transition diagram, formed for the active-memory-bank state transitions corresponding to each bank selection instruction, are used for the detection of redundant code. Instances of code redundancy based on the stipulated rules for the target processor are identified. This validation and optimization tool can be integrated into the system development environment. It is a novel approach, independent of the compiler/assembler and applicable to a wide range of processors once appropriate rules are formulated. Program states are identified mainly by machine-code patterns, which drastically reduces the state space created, contributing to improved state-of-the-art model checking. Though the technique described is general, the implementation is architecture oriented, and hence the feasibility study is conducted on PIC16F87X microcontrollers. The proposed tool will be very useful in steering novices towards the correct use of difficult microcontroller features when developing embedded systems.
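As an illustration of the bank-switching optimization idea (not the tool itself), the sketch below tracks the active memory bank along a linear instruction sequence and flags bank-select instructions that re-select the bank already active; the "BANKSEL n" mnemonic stands in for the PIC's RP0/RP1 bit manipulations.

```python
# Simplified redundant-bank-switch detection on a linear code sequence; the real
# tool works on control flow graphs of machine code, which this sketch does not model.
def redundant_bank_switches(instructions: list) -> list:
    """Return indices of bank-select instructions that re-select the already active bank."""
    redundant, active_bank = [], None
    for i, instr in enumerate(instructions):
        if instr.startswith("BANKSEL"):
            bank = int(instr.split()[1])
            if bank == active_bank:
                redundant.append(i)  # same bank already selected: instruction is removable
            active_bank = bank
    return redundant

if __name__ == "__main__":
    code = ["BANKSEL 1", "MOVWF TRISB", "BANKSEL 1", "MOVWF TRISA", "BANKSEL 0"]
    print(redundant_bank_switches(code))  # -> [2]
```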
Resumo:
Background: The gap between what is known and what is practiced results in health service users not benefitting from advances in healthcare, and in unnecessary costs. A supportive context is considered a key element for the successful implementation of evidence-based practices (EBP). No tools were available for the systematic mapping of aspects of organizational context influencing the implementation of EBPs in low- and middle-income countries (LMICs). This project therefore aimed to develop and psychometrically validate a tool for this purpose. Methods: The development of the Context Assessment for Community Health (COACH) tool was premised on the context dimension of the Promoting Action on Research Implementation in Health Services framework, and it is a derivative product of the Alberta Context Tool. Its development was undertaken in Bangladesh, Vietnam, Uganda, South Africa and Nicaragua in six phases: (1) defining dimensions and developing a draft tool, (2) content validity amongst in-country expert panels, (3) content validity amongst international experts, (4) response process validity, (5) translation and (6) evaluation of psychometric properties amongst 690 health workers in the five countries. Results: The tool was validated for use amongst physicians, nurses/midwives and community health workers. The six phases of development resulted in a good fit between the theoretical dimensions of the COACH tool and its psychometric properties. The tool has 49 items measuring eight aspects of context: Resources, Community engagement, Commitment to work, Informal payment, Leadership, Work culture, Monitoring services for action and Sources of knowledge. Conclusions: Aspects of organizational context that were identified as influencing the implementation of EBPs in high-income settings were also found to be relevant in LMICs. However, there were additional aspects of context of relevance in LMICs, specifically Resources, Community engagement, Commitment to work and Informal payment. Use of the COACH tool will allow for a systematic description of the local healthcare context prior to implementing healthcare interventions, so that implementation strategies can be tailored, or as part of the evaluation of implemented healthcare interventions, thus allowing deeper insights into the process of implementing EBPs in LMICs.