850 results for Ontology validation
Abstract:
GimC/Prefoldin is a hetero-oligomeric complex involved in cytoskeleton biogenesis. In order to identify, by the two-hybrid system, targets that directly interact with Gims and underlie the stress phenotypes, this work aimed at the functional validation of all Gims in Saccharomyces cerevisiae.
Abstract:
Purpose - To develop and validate a psychometric scale for assessing image quality perception for chest X-ray images. Methods - Bandura's theory was used to guide scale development. A review of the literature was undertaken to identify items/factors which could be used to evaluate image quality using a perceptual approach. A draft scale was then created (22 items) and presented to a focus group (student and qualified radiographers). Within the focus group the draft scale was discussed and modified. A series of seven postero-anterior chest images was generated using a phantom with a range of image qualities. Image quality perception was confirmed for the seven images using the signal-to-noise ratio (SNR 17.2–36.5). Participants (student and qualified radiographers and radiology trainees) were then invited to score each of the seven images independently using the draft image quality perception scale. Cronbach's alpha was used to test internal reliability. Results - Fifty-three participants used the scale to grade image quality perception on each of the seven images. The aggregated mean scale score increased with increasing SNR, from 42.1 to 87.7 (r = 0.98, P < 0.001). For each of the 22 individual scale items there was clear differentiation between low-, mid- and high-quality images. A Cronbach's alpha coefficient of >0.7 was obtained for each of the seven images. Conclusion - This study represents the first development of a chest image quality perception scale based on Bandura's theory. There was excellent correlation between the image quality perception scores derived using the scale and the SNR. Further research will involve a more detailed item and factor analysis.
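As an illustration of the internal-reliability check reported above, here is a minimal sketch of Cronbach's alpha computed over a participants-by-items score matrix; the variable names and data are hypothetical placeholders, not the study's.

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for a (participants x items) score matrix.

    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))
    """
    k = scores.shape[1]                          # number of scale items
    item_var = scores.var(axis=0, ddof=1)        # variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of the summed scores
    return k / (k - 1) * (1 - item_var.sum() / total_var)

# Hypothetical data: 53 participants rating 22 items for one image.
rng = np.random.default_rng(0)
ratings = rng.integers(1, 6, size=(53, 22)).astype(float)
print(f"alpha = {cronbach_alpha(ratings):.2f}")  # > 0.7 suggests acceptable reliability
```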
Abstract:
A new procedure for determining eleven organochlorine pesticides in soils using microwave-assisted extraction (MAE) and headspace solid-phase microextraction (HS-SPME) is described. The pesticides studied were mirex, α- and γ-chlordane, p,p’-DDT, heptachlor, heptachlor epoxide isomer A, γ-hexachlorocyclohexane, dieldrin, endrin, aldrin and hexachlorobenzene. The HS-SPME was optimized for the most important parameters, such as extraction time, sample volume and temperature. The present analytical procedure requires a reduced volume of organic solvents and avoids the need for extract clean-up steps. Under optimized conditions, the limits of detection for the method ranged from 0.02 to 3.6 ng/g, intermediate precision ranged from 14 to 36% (as CV%), and recoveries from 8 to 51%. The proposed methodology can be used for the rapid screening of soils for the presence of the selected pesticides, and was applied to landfill soil samples.
Abstract:
A multi-residue methodology based on solid phase extraction followed by gas chromatography–tandem mass spectrometry was developed for the trace analysis of 32 compounds in water matrices, including estrogens and several pesticides from different chemical families, some of them with endocrine-disrupting properties. Matrix standard calibration solutions were prepared by adding known amounts of the analytes to a residue-free sample to compensate for the matrix-induced chromatographic response enhancement observed for certain pesticides. Validation was performed mainly according to the International Conference on Harmonisation recommendations, as well as some European and American validation guidelines with specifications for pesticide analysis and/or GC–MS methodology. As the assumption of homoscedasticity was not met for the analytical data, a weighted least squares linear regression procedure was applied as a simple and effective way to counteract the greater influence of the higher concentrations on the fitted regression line, improving accuracy at the lower end of the calibration curve. The method was considered validated for 31 compounds after consistent evaluation of the key analytical parameters: specificity, linearity, limits of detection and quantification, range, precision, accuracy, extraction efficiency, stability and robustness.
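For context, a weighted least squares fit of a calibration line can be sketched as below. The 1/x² weighting shown is a common choice for heteroscedastic chromatographic data, but it is an assumption here, since the abstract does not state which weighting factor was used; the data are invented.

```python
import numpy as np

# Hypothetical calibration data: concentration vs. instrument response.
x = np.array([1, 5, 10, 50, 100, 500], dtype=float)
y = np.array([0.9, 5.3, 9.8, 51.2, 98.0, 505.0])

# Weighted least squares: minimise sum(w_i * (y_i - a - b*x_i)^2).
# A 1/x^2 weighting down-weights the high-concentration points so they
# do not dominate the fit, improving accuracy at the low end.
w = 1.0 / x**2
X = np.column_stack([np.ones_like(x), x])          # design matrix [1, x]
W = np.diag(w)
beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)   # (X'WX)^-1 X'Wy
print(f"intercept = {beta[0]:.4f}, slope = {beta[1]:.4f}")
```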
Abstract:
Introduction: Nowadays, the concept of ontology (an explicit specification of a conceptualization [Gruber, 1993]) is a key concept in knowledge-based systems in general and in the Semantic Web in particular. However, software agents do not always agree on the same conceptualization, which justifies the existence of several ontologies, even when they address the same domain of discourse. To solve or minimize the interoperability problem between these agents, ontology mapping has proven to be a good solution. Ontology mapping is the process in which semantic relations between entities of a source ontology and a target ontology are specified at the conceptual level; these relations can in turn be used to transform instances based on the source ontology into instances based on the target ontology.

Motivation: In a dynamic environment such as the Semantic Web, agents change not only their data but also their structure and semantics (ontologies). This process, called ontology evolution, can be defined as the timely adaptation of an ontology to changes arising in the domain or in the objectives of the ontology itself, together with the consistent management of those changes [Stojanovic, 2004], and it can sometimes leave the mapping document inconsistent. In heterogeneous environments where interoperability between systems depends on the mapping document, the document must reflect the changes made to the ontologies. In this case there are two solutions: (i) generate a new mapping document (a demanding process in terms of time and computational resources) or (ii) adapt the existing mapping document, correcting invalid semantic relations and creating new relations where necessary (a less demanding process in terms of time and computational resources, but heavily dependent on information about the changes made). The main objective of this work is the analysis, specification and development of a mapping-document evolution process that reflects the changes made during the ontology evolution process.

Context: This work was developed in the context of the MAFRA Toolkit. The MAFRA (MApping FRAmework) Toolkit is an application developed at GECAD that allows the declarative specification of semantic relations between entities of a source ontology and a target ontology, using the following main components: Concept Bridge, which represents a semantic relation between a source concept and a target concept; Property Bridge, which represents a semantic relation between one or more source properties and one or more target properties; and Service, which is applied to Semantic Bridges (Property and Concept Bridges) and defines how source instances are to be transformed into target instances. These concepts are specified in the SBO (Semantic Bridge Ontology) [Silva, 2004]. In the context of this work, a mapping document is an instantiation of the SBO, containing semantic relations between entities of the source ontology and of the target ontology.

Mapping evolution process: The mapping evolution process is the process in which the entities of the mapping document are adapted to reflect any changes in the mapped ontologies, preserving as far as possible the semantics of the specified semantic relations.
If the source and/or target ontologies undergo changes, some semantic relations may become invalid, or new relations may be required; the process therefore comprises two sub-processes: (i) correction of semantic relations and (ii) processing of new ontology entities. Processing new ontology entities requires discovering and computing similarities between entities and specifying relations in accordance with the SBO ontology/language. These phases ("similarity measure" and "semantic bridging") are implemented in the MAFRA Toolkit, and the (semi-)automatic ontology mapping process is described in [Silva, 2004]. Correcting invalid SBO entities requires a good knowledge of the SBO ontology/language, of its entities and relations, and of all its constraints, i.e. of its structure and semantics. This procedure consists of (i) identifying the invalid SBO entities, (ii) determining the cause of their invalidity, and (iii) correcting them in the best possible way. In this phase, information coming from the ontology evolution process was used with the aim of improving the quality of the whole process. Conclusions: In addition to the mapping evolution process that was developed, one of the most important outcomes of this work was the acquisition of deeper knowledge about ontologies, the ontology evolution process, mapping, etc., broadening horizons of knowledge and raising further awareness of the complexity of the problem at hand, which makes it possible to foresee new challenges for the future.
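To make the correction sub-process concrete, here is a minimal, hypothetical sketch of how invalid bridges in a mapping document might be detected and handled after ontology evolution; the class and field names are illustrative only, not the actual SBO/MAFRA Toolkit API.

```python
from dataclasses import dataclass

@dataclass
class ConceptBridge:
    source_concept: str   # entity URI in the source ontology
    target_concept: str   # entity URI in the target ontology

def repair_mapping(bridges, source_entities, target_entities, renames):
    """Adapt a mapping document after ontology evolution (illustrative).

    renames: {old_uri: new_uri} change log produced by the ontology
    evolution process. Bridges whose endpoints were renamed are fixed;
    bridges whose endpoints were deleted are dropped as invalid.
    """
    repaired = []
    for b in bridges:
        src = renames.get(b.source_concept, b.source_concept)
        tgt = renames.get(b.target_concept, b.target_concept)
        if src in source_entities and tgt in target_entities:
            repaired.append(ConceptBridge(src, tgt))  # still (or again) valid
        # else: an endpoint no longer exists, so the bridge is invalid and
        # a new relation must be found via similarity measurement.
    return repaired
```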
Abstract:
The major lipid components of foods are usually analyzed by individual methodologies using diverse extractive procedures for each class. A simple and fast extractive procedure was devised for the sequential analysis of vitamin E, cholesterol, fatty acids, and total fat estimation in seafood, reducing analysis time and organic solvent consumption. Several liquid/liquid-based extractive methodologies using chlorinated and non-chlorinated organic solvents were tested. The extract obtained is used for vitamin E quantification (normal-phase HPLC with fluorescence detection), total cholesterol (normal-phase HPLC with UV detection), fatty acid profile, and total fat estimation (GC-FID), all accomplished in <40 min. The final methodology presents an adequate linearity range and sensitivity for tocopherol and cholesterol, with intra- and inter-day precisions (RSD) from 3 to 11% for all the components. The developed methodology was applied to diverse seafood samples with positive outcomes, making it a very attractive technique for routine analyses in standard-equipped laboratories in the food quality control field.
Abstract:
WiDom is a wireless prioritized medium access control protocol which offers a very large number of priority levels. Hence, it brings the potential to employ non-preemptive static-priority scheduling and schedulability analysis for a wireless channel, assuming that the overhead of WiDom is modeled properly. One schedulability analysis for WiDom has already been proposed, but recent research has created a new version of WiDom (which we call slotted WiDom) with lower overhead, and for this version of WiDom no schedulability analysis exists. In this paper we propose a new schedulability analysis for slotted WiDom and extend it to work also for message streams with release jitter. We have performed experiments with an implementation of slotted WiDom on a real-world platform (MicaZ). We find that, for each message stream, the maximum observed response time never exceeds the calculated response time, which corroborates our belief that our new scheduling theory is applicable in practice.
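The test sketched below is the classical response-time analysis for non-preemptive static-priority scheduling with release jitter (blocking by at most one lower-priority message plus higher-priority interference), in a slightly simplified form. It is a generic illustration under those standard assumptions, not the exact slotted-WiDom analysis of the paper; the stream parameters are invented.

```python
import math

def response_time(i, C, T, J):
    """Worst-case response time of message stream i (0 = highest priority).

    C[i]: transmission time, T[i]: period, J[i]: release jitter.
    The queuing delay w suffers blocking from at most one lower-priority
    message already on the channel, plus higher-priority interference.
    """
    B = max([C[k] for k in range(len(C)) if k > i], default=0.0)  # blocking
    w = B
    while True:
        w_next = B + sum(math.ceil((w + J[j]) / T[j]) * C[j] for j in range(i))
        if w_next == w:                     # fixed point reached
            return J[i] + w + C[i]          # response time incl. own jitter
        w = w_next

# Hypothetical streams (C, T, J) in milliseconds, indexed by priority.
C, T, J = [2.0, 3.0, 4.0], [10.0, 20.0, 50.0], [1.0, 1.0, 2.0]
for i in range(3):
    print(f"stream {i}: R = {response_time(i, C, T, J):.1f} ms")
```

A stream set is deemed schedulable when every calculated R is at most the stream's deadline; the experiments described above check that observed response times stay below these bounds.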
Abstract:
IEEE 802.15.4 is the most widely used protocol for Wireless Sensor Networks (WSNs) and is being used as a baseline for several higher-layer protocols such as ZigBee, 6LoWPAN or WirelessHART. Its MAC (Medium Access Control) supports both contention-free (CFP, based on the reservation of guaranteed time slots, GTS) and contention-based (CAP, ruled by CSMA/CA) access when operating in beacon-enabled mode. Thus, it enables the differentiation between real-time and best-effort traffic. However, some WSN applications and higher-layer protocols may strongly benefit from the possibility of supporting more traffic classes. This happens, for instance, in dense WSNs used in time-sensitive industrial applications. In this context, we propose to differentiate traffic classes within the CAP, enabling lower transmission delays and a higher success probability for time-critical messages, such as those for event detection, GTS reservation and network management. Building upon a previously proposed methodology (TRADIF), in this paper we outline its implementation and experimental validation over a real-time operating system. Importantly, TRADIF is fully backward compatible with the IEEE 802.15.4 standard, enabling different traffic classes to be created just by tuning some MAC parameters.
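To hint at what tuning MAC parameters per traffic class can look like, the sketch below assigns different slotted CSMA/CA back-off settings to two classes. The attribute names macMinBE, macMaxBE and CW come from the IEEE 802.15.4 slotted CSMA/CA algorithm, but the specific values and the dispatch function are hypothetical illustrations, not TRADIF's actual parameterisation.

```python
# Per-class slotted CSMA/CA settings: lower back-off exponents and a
# smaller contention window give time-critical frames a statistically
# shorter channel-access delay within the CAP.
TRAFFIC_CLASSES = {
    "time_critical": {"macMinBE": 0, "macMaxBE": 2, "CW": 1},  # e.g. alarms, GTS requests
    "best_effort":   {"macMinBE": 3, "macMaxBE": 5, "CW": 2},  # regular sensor data
}

def csma_params(frame_kind: str) -> dict:
    """Pick back-off parameters for an outgoing frame (illustrative)."""
    critical = frame_kind in {"event", "gts_request", "network_mgmt"}
    return TRAFFIC_CLASSES["time_critical" if critical else "best_effort"]

print(csma_params("gts_request"))  # {'macMinBE': 0, 'macMaxBE': 2, 'CW': 1}
```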
Abstract:
Dissertation presented to obtain the degree of Doctor in Informatics from the Universidade Nova de Lisboa, Faculdade de Ciências e Tecnologia
Abstract:
Polyolefins are especially difficult to bond due to their non-polar, non-porous and chemically inert surfaces. Acrylic adhesives used in industry are particularly suited to bonding these materials, including many grades of polypropylene (PP) and polyethylene (PE), without special surface preparation. In this work, the tensile strength of single-lap PE and mixed joints bonded with an acrylic adhesive was investigated. The mixed joints combined PE with aluminium (AL) or carbon fibre reinforced plastic (CFRP) substrates. The PE substrates were only cleaned with isopropanol, which ensured cohesive failures. For the PE–CFRP joints, three different surface preparations were employed for the CFRP substrates: cleaning with acetone, abrasion with 100-grit sandpaper, and peel-ply finishing. In the PE–AL joints, the AL bonding surfaces were prepared by the following methods: cleaning with acetone, abrasion with 180- and 320-grit sandpaper, grit blasting, and chemical etching with chromic acid. After abrasion of the CFRP and AL substrates, the surfaces were always cleaned with acetone. The tensile strengths were compared with numerical results from ABAQUS® and a mixed-mode (I+II) cohesive damage model. Good agreement was found between the experimental and numerical results, except for the PE–AL joints, since the AL surface treatments were not found to be effective.
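For reference, a common form of mixed-mode (I+II) cohesive damage model combines a quadratic stress criterion for damage initiation with a linear energetic criterion for propagation. The sketch below encodes those two textbook checks as a generic illustration; it is not the specific model used with ABAQUS® in this work, and the material values are invented.

```python
def damage_initiated(t_n, t_s, t_n0, t_s0):
    """Quadratic nominal-stress initiation criterion (mode I + mode II).

    (<t_n>/t_n0)^2 + (t_s/t_s0)^2 >= 1, where <.> ignores compression.
    """
    return (max(t_n, 0.0) / t_n0) ** 2 + (t_s / t_s0) ** 2 >= 1.0

def crack_propagates(G_I, G_II, G_Ic, G_IIc):
    """Linear energetic propagation criterion: G_I/G_Ic + G_II/G_IIc >= 1."""
    return G_I / G_Ic + G_II / G_IIc >= 1.0

# Hypothetical adhesive properties (tractions in MPa, energies in N/mm):
print(damage_initiated(t_n=12.0, t_s=10.0, t_n0=20.0, t_s0=18.0))  # False
print(crack_propagates(G_I=0.6, G_II=1.2, G_Ic=1.0, G_IIc=4.0))    # False
```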
Abstract:
OBJECTIVE Translate the Patient-centered Assessment and Counseling for Exercise questionnaire, adapt it cross-culturally and identify the psychometric properties of the psychosocial scales for physical activity in young university students. METHODS The Patient-centered Assessment and Counseling for Exercise questionnaire is made up of 39 items divided into constructs based on social cognitive theory and the transtheoretical model. The constructs analyzed were as follows: behavior change strategy (15 items), decision-making process (10), self-efficacy (6), support from family (4), and support from friends (4). The validation procedures were conceptual, semantic, operational, and functional equivalences, in addition to the equivalence of the items and of measurements. The conceptual, item, and semantic equivalences were assessed by a specialized committee. For measurement equivalence, the instrument was applied to 717 university students. Exploratory factor analysis was used to verify the loading of each item, the explained variance, and the internal consistency of the constructs. Reproducibility was measured by means of the intraclass correlation coefficient. RESULTS The two translations were equivalent and the back-translation was similar to the original version, with few adaptations. The layout and the presentation order of the constructs and items were kept in the same form as in the original instrument. The sample size was adequate, as evaluated by the Kaiser-Meyer-Olkin test, with values between 0.72 and 0.91. The correlation matrix of the items presented r < 0.8 (p < 0.05). The factor loadings of the items from all the constructs were satisfactory (> 0.40), varying between 0.43 and 0.80, and explained between 45.4% and 59.0% of the variance. Internal consistency was satisfactory (α ≥ 0.70), ranging from 0.70 for support from friends to 0.92 for self-efficacy. Most items (74.3%) presented values above 0.70 in the reproducibility test. CONCLUSIONS The validation process steps were considered satisfactory and adequate for application to the population.
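The factor-analytic check reported above can be sketched as follows; the data are random placeholders and sklearn's FactorAnalysis is just one of several possible tools, so this illustrates the procedure rather than reproducing the study's actual analysis.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

# Hypothetical responses: 717 students x 39 Likert-scale items.
rng = np.random.default_rng(42)
X = rng.integers(1, 6, size=(717, 39)).astype(float)

# Exploratory factor analysis with the five theorised constructs.
fa = FactorAnalysis(n_components=5, rotation="varimax")
fa.fit(X)
loadings = fa.components_.T          # (39 items x 5 factors)

# Flag items with a satisfactory loading (> 0.40) on at least one factor.
ok = (np.abs(loadings) > 0.40).any(axis=1)
print(f"{ok.sum()} of {len(ok)} items load > 0.40 on some factor")
```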
Abstract:
OBJECTIVE To validate an instrument designed to assess health promotion in the school environment. METHODS A questionnaire, based on guidelines from the World Health Organization and in line with the Brazilian school health context, was developed to validate the research instrument. There were 60 items in the instrument, comprising 40 questions for the school manager and 20 items with direct observations made by the interviewer. Content validation of the items was performed using the Delphi technique, and the instrument was applied in 53 schools from two medium-sized cities in the South region of Brazil. Reliability (Cronbach's alpha and split-half) and validity (principal component analysis) analyses were performed. RESULTS The final instrument comprised 28 items, distributed across three dimensions: pedagogical, structural and relational. The resulting components showed good factor loadings (> 0.4) and acceptable reliability (> 0.6) for most items. The pedagogical dimension identifies educational activities regarding drugs and sexuality, violence and prejudice, self-care, and peace and quality of life. The structural dimension comprises access, sanitary facilities, and conservation and equipment. The relational dimension includes relationships within the school and with the community. CONCLUSIONS The proposed instrument presents satisfactory validity and reliability values, covering aspects relevant to health promotion in schools. Its use allows the description of the health promotion conditions to which students of each educational institution are exposed. Because this instrument includes items directly observed by the investigator, it should only be used during periods when there are full and regular activities at the school in question.
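As an aside, the split-half reliability reported alongside Cronbach's alpha can be computed as sketched below, using an odd/even item split with the Spearman-Brown correction; the data here are hypothetical.

```python
import numpy as np

def split_half_reliability(scores: np.ndarray) -> float:
    """Odd/even split-half reliability with Spearman-Brown correction.

    r_sb = 2r / (1 + r), where r is the correlation between the summed
    odd-item and even-item half scores.
    """
    odd = scores[:, 0::2].sum(axis=1)
    even = scores[:, 1::2].sum(axis=1)
    r = np.corrcoef(odd, even)[0, 1]
    return 2 * r / (1 + r)

# Hypothetical data: 53 schools x 28 instrument items.
rng = np.random.default_rng(7)
data = rng.integers(0, 4, size=(53, 28)).astype(float)
print(f"split-half reliability = {split_half_reliability(data):.2f}")
```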
Abstract:
Prototype validation is a major concern in modern electronic product design and development. Simulation, structural test, and functional and timing debug all form part of the validation process, although they are very often addressed as dissociated tasks. In this paper we describe an integrated approach to board-level prototype validation, based on a set of mandatory/optional BST instructions and a built-in controller for debug and test, which addresses the aforementioned tasks as inherent parts of a single process.
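For orientation, the IEEE 1149.1 (boundary-scan) instruction set referred to above distinguishes mandatory instructions from optional ones. The names below are the standard's, but the grouping into a debug-and-test flow is a hypothetical sketch, not the paper's controller design.

```python
# IEEE 1149.1 boundary-scan instructions (names per the standard).
MANDATORY = ["BYPASS", "EXTEST", "SAMPLE/PRELOAD"]
OPTIONAL = ["IDCODE", "USERCODE", "INTEST", "RUNBIST", "CLAMP", "HIGHZ"]

def board_test_plan(use_optional: bool) -> list[str]:
    """Hypothetical board-level validation flow using BST instructions."""
    plan = ["SAMPLE/PRELOAD",   # snapshot pin states, preload safe values
            "EXTEST"]           # drive/capture pins to test interconnects
    if use_optional:
        plan += ["INTEST",      # exercise core logic through the scan chain
                 "RUNBIST"]     # trigger built-in self-test, if implemented
    return plan + ["BYPASS"]    # shorten the chain while the part is idle

print(board_test_plan(use_optional=True))
```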
Abstract:
Dissertation submitted in partial fulfilment of the requirements for the Degree of Master of Science in Geospatial Technologies