73 results for Conformance
Abstract:
Background: Ontologies have increasingly been used in the biomedical domain, which has prompted the emergence of different initiatives to facilitate their development and integration. The Open Biological and Biomedical Ontologies (OBO) Foundry consortium provides a repository of life-science ontologies, which are developed according to a set of shared principles. This consortium has developed an ontology called OBO Relation Ontology, aimed at standardizing the different types of biological entity classes and associated relationships. Since ontologies are primarily intended to be used by humans, the use of graphical notations for ontology development facilitates the capture, comprehension and communication of knowledge between its users. However, OBO Foundry ontologies are captured and represented mainly using text-based notations. The Unified Modeling Language (UML) provides a standard and widely-used graphical notation for modeling computer systems. UML provides a well-defined set of modeling elements, which can be extended using a built-in extension mechanism named Profile. Thus, this work aims at developing a UML profile for the OBO Relation Ontology to provide a domain-specific set of modeling elements that can be used to create standard UML-based ontologies in the biomedical domain. Results: We have studied the OBO Relation Ontology, the UML metamodel and the UML profiling mechanism. Based on these studies, we have proposed an extension to the UML metamodel in conformance with the OBO Relation Ontology and we have defined a profile that implements the extended metamodel. Finally, we have applied the proposed UML profile in the development of a number of fragments from different ontologies. In particular, we have considered the Gene Ontology (GO), the PRotein Ontology (PRO) and the Xenopus Anatomy and Development Ontology (XAO). Conclusions: The use of an established and well-known graphical language in the development of biomedical ontologies provides a more intuitive form of capturing and representing knowledge than using only text-based notations. The use of the profile requires the domain expert to reason about the underlying semantics of the concepts and relationships being modeled, which helps prevent the introduction of inconsistencies in an ontology under development and facilitates the identification and correction of errors in an already defined ontology.
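As an editorial aside, a minimal Python sketch of the kind of semantic check such a profile encourages (the class names, categories and constraints below are hypothetical illustrations, not the authors' profile):

```python
# Illustrative sketch only: a tiny checker mimicking two constraints a
# modeler might enforce when using OBO-style relations, namely that the
# 'is_a' hierarchy is acyclic and that 'part_of' does not cross
# upper-level categories (e.g. process vs. object).

from collections import defaultdict

class Ontology:
    def __init__(self):
        self.is_a = defaultdict(set)      # child -> parents
        self.part_of = defaultdict(set)   # part -> wholes
        self.category = {}                # class -> upper-level category

    def is_a_acyclic(self):
        """Reject is_a cycles, which would make the taxonomy inconsistent."""
        seen, stack = set(), set()
        def visit(c):
            if c in stack:
                return False
            if c in seen:
                return True
            stack.add(c)
            ok = all(visit(p) for p in self.is_a[c])
            stack.discard(c)
            seen.add(c)
            return ok
        return all(visit(c) for c in list(self.is_a))

    def part_of_well_typed(self):
        """part_of should relate classes of the same upper-level category."""
        return all(self.category.get(p) == self.category.get(w)
                   for p in self.part_of for w in self.part_of[p])

onto = Ontology()
onto.category.update({"mitochondrion": "object", "cell": "object"})
onto.is_a["mitochondrion"].add("organelle")
onto.part_of["mitochondrion"].add("cell")
print(onto.is_a_acyclic(), onto.part_of_well_typed())  # True True
```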
Abstract:
Service Oriented Computing is a new programming paradigm for addressing distributed system design issues. Services are autonomous computational entities which can be dynamically discovered and composed in order to form more complex systems able to achieve different kinds of tasks. E-government, e-business and e-science are some examples of the IT areas where Service Oriented Computing will be exploited in the coming years. At present, the most credited Service Oriented Computing technology is that of Web Services, whose specifications are enriched day by day by industrial consortia without following a precise and rigorous approach. This PhD thesis aims, on the one hand, at modelling Service Oriented Computing in a formal way in order to precisely define the main concepts it is based upon and, on the other hand, at defining a new approach, called the bipolar approach, for addressing system design issues by synergistically exploiting choreography and orchestration languages related by means of a mathematical relation called conformance. Choreography allows us to describe systems of services from a global viewpoint, whereas orchestration supplies a means for addressing such an issue from a local perspective. In this work we present SOCK, a process-algebra-based language inspired by the Web Service orchestration language WS-BPEL which captures the essentials of Service Oriented Computing. From the definition of SOCK we are able to define a general model for dealing with Service Oriented Computing where services and systems of services are related to the design of finite state automata and process algebra concurrent systems, respectively. Furthermore, we introduce a formal language for dealing with choreography. Such a language is equipped with a formal semantics and it forms, together with a subset of the SOCK calculus, the bipolar framework. Finally, we present JOLIE, which is a Java implementation of a subset of the SOCK calculus and is part of the bipolar framework we intend to promote.
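A hedged sketch of the conformance idea as trace inclusion between composed orchestrators and a choreography; the automata, messages and naive shuffle composition below are invented for illustration and are not the SOCK/JOLIE formalism:

```python
# Toy model: each orchestrator is a small labelled transition system; the
# choreography is taken as the set of allowed global message traces. The
# composition conforms if every complete trace it can produce is allowed.

def traces(automaton, state, depth):
    """Enumerate label sequences of bounded length from a dict-based LTS."""
    yield ()
    if depth == 0:
        return
    for label, nxt in automaton.get(state, []):
        for rest in traces(automaton, nxt, depth - 1):
            yield (label,) + rest

def interleavings(t1, t2):
    """All shuffles of two local traces (naive parallel composition)."""
    if not t1: yield t2; return
    if not t2: yield t1; return
    for rest in interleavings(t1[1:], t2):
        yield (t1[0],) + rest
    for rest in interleavings(t1, t2[1:]):
        yield (t2[0],) + rest

# Two hypothetical orchestrators: a buyer and a seller.
buyer  = {0: [("order", 1)], 1: [("pay", 2)]}
seller = {0: [("order", 1)], 1: [("ship", 2)]}

choreography = {("order", "order", "pay", "ship"),
                ("order", "order", "ship", "pay")}  # allowed global traces

system_traces = {s for a in traces(buyer, 0, 2)
                   for b in traces(seller, 0, 2)
                   for s in interleavings(a, b)
                   if len(s) == 4}  # complete runs only

print(system_traces <= choreography)
# False: free interleaving lets the buyer 'pay' before the seller receives
# the 'order', which the choreography forbids -- exactly the kind of
# mismatch a conformance relation is meant to expose.
```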
Abstract:
Interaction protocols establish how different computational entities can interact with each other. The interaction can be aimed at the exchange of data, as in 'communication protocols', or oriented towards achieving some result, as in 'application protocols'. Moreover, with the increasing complexity of modern distributed systems, protocols are also used to control such complexity and to ensure that the system as a whole evolves with certain features. However, the extensive use of protocols has raised several issues, ranging from the language for specifying them to the many aspects of verification. Computational Logic provides models, languages and tools that can be effectively adopted to address these issues: its declarative nature can be exploited for a protocol specification language, while its operational counterpart can be used to reason upon such specifications. In this thesis we propose a proof-theoretic framework, called SCIFF, together with its extensions. SCIFF is based on Abductive Logic Programming and provides a formal specification language with a clear declarative semantics (based on abduction). The operational counterpart is given by a proof procedure that allows one to reason upon the specifications and to test the conformance of given interactions w.r.t. a defined protocol. Moreover, by suitably adapting the SCIFF framework, we propose solutions for addressing (1) the verification of protocol properties (the g-SCIFF framework), and (2) the a-priori conformance verification of peers w.r.t. a given protocol (the AlLoWS framework). We also introduce an agent-based architecture, the SCIFF Agent Platform, where the same protocol specification can be used to program the interacting peers and to ease their implementation task.
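A toy illustration (not the SCIFF proof procedure itself) of conformance with respect to positive and negative expectations; the event names are invented:

```python
# In SCIFF-style terms, positive expectations E(event) must be matched by
# a happened event H(event), while negative expectations EN(event) must
# not be matched. A trace conforms if both conditions hold.

def conformant(happened, positive, negative):
    """True iff every positive expectation is fulfilled and no negative
    expectation is violated by the observed interaction."""
    return (all(e in happened for e in positive)
            and not any(e in happened for e in negative))

trace = {"request(alice,bob)", "accept(bob,alice)"}
E  = {"request(alice,bob)", "accept(bob,alice)"}  # expected to happen
EN = {"refuse(bob,alice)"}                        # expected not to happen
print(conformant(trace, E, EN))  # True: the interaction conforms
```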
Abstract:
In this work, aqueous suspensions of charge-stabilized colloidal particles were investigated with respect to their behavior under the influence of electric fields. In particular, the electrophoretic mobility µ was studied over a wide range of particle concentrations in order to compare the individual behavior of single particles with the so far little-studied collective behavior of particle ensembles (specifically, of fluid-like or crystalline ordered ensembles). For this purpose, a super-heterodyne Doppler velocimetric light-scattering experiment with integral and local data acquisition was designed, which allows the velocity of the particles in electric fields to be studied. The experiment was first successfully tested in the regime of non-ordered and fluid-ordered suspensions. Subsequently, this instrument made it possible for the first time to investigate the electrophoretic behavior of crystalline-ordered suspensions. A complex flow behavior was observed and documented in detail, revealing effects not previously reported in this context, such as block flow, shear banding, shear melting and elastic resonances. On the other hand, this behavior made it necessary to develop a new evaluation routine for µ in the crystalline state, for which the heterodyne light-scattering theory had to be extended to the super-heterodyne case with shear. This was first carried out for non-ordered systems. This approximate description proved sufficient to interpret the light-scattering behavior of sheared crystalline systems under the given experimental conditions. As a further important result, a general mobility-concentration curve could thus be obtained. It shows the already known increase at low particle concentrations and a plateau at intermediate concentrations; at high concentrations the mobility decreases again. For the interpretation of this behavior in terms of particle charge, only theories for non-interacting particles are currently available. Applying these, one finds a surprisingly good agreement of the electrophoretically determined particle charge Z*µ with numerically determined effective particle charges Z*PBC.
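For orientation only, the standard non-interacting electrokinetic relations that such an interpretation typically relies on (textbook formulas, not taken from the thesis) are, in the Hückel and Smoluchowski limits:

```latex
\mu = \frac{2\,\varepsilon\,\zeta}{3\,\eta} \quad (\kappa a \ll 1),
\qquad
\mu = \frac{\varepsilon\,\zeta}{\eta} \quad (\kappa a \gg 1),
\qquad
Z^{*}e = 4\pi\varepsilon\, a\,(1+\kappa a)\,\zeta ,
```

with solvent permittivity ε, viscosity η, particle radius a and inverse Debye screening length κ; the effective charge follows from the ζ potential via the linearized Poisson-Boltzmann expression on the right.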
Abstract:
Paper documents are currently being replaced by their electronic versions, which also contain some biometric characteristics; this has enabled automatic checks both when the document is issued and when the identity of the person must be verified. To make this possible, the photograph must comply with quality standards. The ISO/IEC 19794-5 standard provides guidelines and examples of acceptable and non-acceptable face images. In recent years, many companies have developed SDKs with the aim of implementing the tests proposed by the standard. This thesis takes on the task of providing a framework that delivers good performance, both in terms of speed and in the accuracy of the results.
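As an illustration of the kind of geometric test such frameworks perform, a hedged Python sketch follows; the thresholds are hypothetical, not the normative ISO/IEC 19794-5 values:

```python
# Two toy conformance checks on a face photo: the inter-eye distance must
# be a sufficient fraction of the image width, and the face must be
# roughly centred horizontally. Coordinates are (x, y) pixels.

def eye_distance_ok(left_eye, right_eye, img_width, min_ratio=0.25):
    """Inter-eye distance as a fraction of image width (threshold assumed)."""
    dx = abs(right_eye[0] - left_eye[0])
    return dx / img_width >= min_ratio

def head_centered_ok(face_center, img_width, tol=0.1):
    """Face midpoint must lie near the horizontal centre of the image."""
    return abs(face_center[0] - img_width / 2) <= tol * img_width

print(eye_distance_ok((120, 200), (280, 200), 512))  # True
print(head_centered_ok((250, 240), 512))             # True
```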
Abstract:
BACKGROUND E-learning and blended learning approaches are gaining more and more popularity in emergency medicine curricula. So far, little data is available on the impact of such approaches on procedural learning and skill acquisition, or on how they compare with traditional approaches. OBJECTIVE This study investigated the impact of a blended learning approach, including Web-based virtual patients (VPs) and standard pediatric basic life support (PBLS) training, on procedural knowledge, objective performance, and self-assessment. METHODS A total of 57 medical students were randomly assigned to an intervention group (n=30) and a control group (n=27). Both groups received paper handouts in preparation of simulation-based PBLS training. The intervention group additionally completed two Web-based VPs with embedded video clips. Measurements were taken at randomization (t0), after the preparation period (t1), and after hands-on training (t2). Clinical decision-making skills and procedural knowledge were assessed at t0 and t1. PBLS performance was scored regarding adherence to the correct algorithm, conformance to temporal demands, and the quality of procedural steps at t1 and t2. Participants' self-assessments were recorded in all three measurements. RESULTS Procedural knowledge of the intervention group was significantly superior to that of the control group at t1. At t2, the intervention group showed significantly better adherence to the algorithm and temporal demands, and better procedural quality of PBLS in objective measures than did the control group. These aspects differed between the groups even at t1 (after VPs, prior to practical training). Self-assessments differed significantly only at t1 in favor of the intervention group. CONCLUSIONS Training with VPs combined with hands-on training improves PBLS performance as judged by objective measures.
Abstract:
In an increasing number of applications (e.g., in embedded, real-time, or mobile systems) it is important or even essential to ensure conformance with respect to a specification expressing resource usages, such as execution time, memory, energy, or user-defined resources. In previous work we have presented a novel framework for data size-aware, static resource usage verification. Specifications can include both lower and upper bound resource usage functions. In order to statically check such specifications, upper- and lower-bound resource usage functions (on input data sizes) approximating the actual resource usage of the program are automatically inferred and compared against the specification. The outcome of the static checking of assertions can express intervals for the input data sizes such that a given specification can be proved for some intervals but disproved for others. After an overview of the approach, in this paper we provide a number of novel contributions: we present a full formalization, and we report on and provide results from an implementation within the Ciao/CiaoPP framework (which provides a general, unified platform for static and run-time verification, as well as unit testing). We also generalize the checking of assertions to allow preconditions expressing intervals within which the input data size of a program is supposed to lie (i.e., intervals for which each assertion is applicable), and we extend the class of resource usage functions that can be checked.
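A hedged sketch of the interval-based checking idea (not the actual CiaoPP machinery): inferred lower and upper resource-usage bounds are compared against a specified upper bound, yielding the data-size intervals on which the assertion is proved or disproved.

```python
# For each input data size n, the assertion 'usage <= spec_ub(n)' is
# proved if even the inferred upper bound stays within the spec, and
# disproved if even the inferred lower bound already exceeds it.

def check(inferred_lb, inferred_ub, spec_ub, sizes):
    proved    = [n for n in sizes if inferred_ub(n) <= spec_ub(n)]  # spec surely met
    disproved = [n for n in sizes if inferred_lb(n) >  spec_ub(n)]  # spec surely broken
    unknown   = [n for n in sizes if n not in proved and n not in disproved]
    return proved, disproved, unknown

# Hypothetical program with inferred usage between n^2 and n^2 + 5 steps,
# checked against the specification 'usage <= 2n + 40'.
proved, disproved, unknown = check(lambda n: n * n, lambda n: n * n + 5,
                                   lambda n: 2 * n + 40, range(1, 13))
print(proved)     # [1, ..., 7]  : assertion proved for these data sizes
print(disproved)  # [8, ..., 12] : assertion disproved for these data sizes
print(unknown)    # []
```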
Abstract:
Service compositions put together loosely-coupled component services to perform more complex, higher level, or cross-organizational tasks in a platform-independent manner. Quality-of-Service (QoS) properties, such as execution time, availability, or cost, are critical for their usability, and permissible boundaries for their values are defined in Service Level Agreements (SLAs). We propose a method whereby constraints that model SLA conformance and violation are derived at any given point of the execution of a service composition. These constraints are generated using the structure of the composition and properties of the component services, which can be either known or empirically measured. Violation of these constraints means that the corresponding scenario is unfeasible, while satisfaction gives values for the constrained variables (start / end times for activities, or number of loop iterations) which make the scenario possible. These results can be used to perform optimized service matching or trigger preventive adaptation or healing.
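A minimal sketch of such a constraint (the composition structure and numbers are invented, not the authors' derivation): for a sequential composition, the end-to-end time is the sum of the remaining components' times, so at any execution point we can test whether meeting the SLA is still feasible.

```python
# Constraint: elapsed + sum of best-case remaining service times <= SLA
# bound. If this is violated, every continuation of the current scenario
# breaks the SLA, so preventive adaptation or healing can be triggered.

def still_feasible(elapsed, remaining_times, sla_bound):
    """Feasibility of SLA conformance at a given point of the execution."""
    return elapsed + sum(remaining_times) <= sla_bound

# Hypothetical composition: two services left, with measured best-case times.
print(still_feasible(elapsed=2.0, remaining_times=[1.0, 0.5], sla_bound=4.0))  # True
print(still_feasible(elapsed=3.8, remaining_times=[1.0, 0.5], sla_bound=4.0))  # False -> adapt
```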
Abstract:
Context. This thesis is framed within experimental software engineering. More concretely, it addresses the problems that arise when assessing process conformance in test-driven development experiments conducted by UPM's Experimental Software Engineering group. Process conformance was studied using Besouro, an Eclipse plug-in tool. It has been observed that Besouro does not work correctly in some circumstances, which casts doubt on the correctness of the existing experimental data and renders it useless. Aim. The main objective of this work is the identification and correction of Besouro's faults. A secondary goal is fixing the datasets already obtained in past experiments to the maximum possible extent, so that existing experimental results can be used with confidence. Method. (1) Test Besouro using different sequences of events (creation methods, assertions, etc.) to identify the underlying faults; (2) fix the code; and (3) fix the datasets using code specially created for this purpose. Results. (1) We confirmed the existence of several faults in Besouro's code that affected Test-First and Test-Last episode identification. These faults caused the incorrect identification of 20% of episodes. (2) We were able to fix Besouro's code. (3) The correction of the existing datasets was possible, subject to some restrictions (such as the impossibility of tracing code size increases back to programming time). Conclusion. The results of past experiments that depend upon Besouro's data cannot be trusted. We suspect that more faults remain in Besouro's code, whose identification requires further analysis.
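To make the notion of episode identification concrete, a toy classifier follows; it is an invented illustration of the idea, not Besouro's actual algorithm:

```python
# Toy rule: an episode is 'test-first' if a test edit precedes the first
# production edit, and 'test-last' otherwise. Event tags are hypothetical.

def classify_episode(events):
    """events: ordered list of 'test-edit' | 'prod-edit' | 'test-run' tags."""
    for kind in events:
        if kind == "test-edit":
            return "test-first"
        if kind == "prod-edit":
            return "test-last"
    return "unknown"

print(classify_episode(["test-edit", "prod-edit", "test-run"]))  # test-first
print(classify_episode(["prod-edit", "test-edit", "test-run"]))  # test-last
```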
Abstract:
This research work presents the Methodology for the Homologation of the Equipment of the Canal Azul da Carne System - MHECAC. This proposed methodology complements the development of the Canal Azul da Carne System and aims to support MAPA (the Brazilian Ministry of Agriculture, Livestock and Food Supply) in establishing a conformity-verification process for equipment, seeking to guarantee the interoperability, performance and safety of its hardware components. The Canal Azul System is an undertaking of MAPA together with GAESI at the Escola Politécnica of the Universidade de São Paulo (EPUSP) and the private sector, and aims to reduce the time spent in meat export processes in Brazil. MHECAC is based on the conformity-assessment process structure established by the Brazilian Conformity Assessment System (Sistema Brasileiro de Avaliação da Conformidade - SBAC), and its general requirements can be applied to product conformity assessment in a variety of sectors. In the development of MHECAC, the main technical and normative references corresponding to the equipment that makes up the Canal Azul system architecture were applied. In addition, the homologation, audit and inspection models, the sampling plans, the minimum requirements and the test methodology were defined. MHECAC is subdivided into two main segments. The first presents the general requirements for establishing conformity-assessment and product-certification systems, and the application of these requirements is not limited to the Canal Azul System; the second presents requirements specific to the system established by MAPA. The application of MHECAC favors the equal treatment of suppliers and is an important benchmark for equipment selection, since it allows solutions to be qualified and compared on a technical basis guided by quality.
Abstract:
Diversity-based design, or the goal of ensuring that web-based information is accessible to as many diverse users as possible, has received growing international acceptance in recent years, with many countries introducing legislation to enforce it. This paper analyses web content accessibility levels in Spanish education portals according to the international guidelines established by the World Wide Web Consortium (W3C) and the Web Accessibility Initiative (WAI). Additionally, it suggests the calculation of an inaccessibility rate as a tool for measuring the degree of non-compliance with WAI Guidelines 2.0, as well as illustrating the significant gap that separates people with disabilities from digital education environments (with a 7.77% average). A total of twenty-one educational web portals with two different web depth levels (42 sampling units) were assessed for this purpose using the automated analysis tool Web Accessibility Test 2.0 (TAW, for its initials in Spanish). The present study reveals a general trend towards non-compliance with the technical accessibility recommendations issued by the W3C-WAI group (97.62% of the websites examined present mistakes in Level A conformance). Furthermore, despite the increasingly high number of legal and regulatory measures about accessibility, their practical application still remains unsatisfactory. A greater level of involvement must be assumed in order to raise awareness and enhance training efforts towards accessibility in the context of collective Information and Communication Technologies (ICTs), since this represents not only a necessity but also an ethical, social, political and legal commitment to be assumed by society.
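A hedged sketch of one natural reading of such an inaccessibility rate (assumed here, not quoted from the paper): the proportion of failed accessibility checkpoints over all applicable checkpoints.

```python
# Toy computation of an inaccessibility rate for one sampling unit; the
# counts are hypothetical, not the paper's data.

def inaccessibility_rate(failed_checks, applicable_checks):
    """Degree of non-compliance with the WAI guidelines, as a percentage."""
    return 100.0 * failed_checks / applicable_checks

# Hypothetical portal page: 12 of 80 applicable checkpoints fail.
print(f"{inaccessibility_rate(12, 80):.2f}%")  # 15.00%
```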
Abstract:
Architectural decisions are often encoded in the form of constraints and guidelines. Non-functional requirements can be ensured by checking the conformance of the implementation against this kind of invariant. Conformance checking is often a costly and error-prone process that involves the use of multiple tools, differing in effectiveness, complexity and scope of applicability. To reduce the overall effort entailed by this activity, we propose a novel approach that supports verification of human-readable declarative rules through the use of adapted off-the-shelf tools. Our approach consists of a rule specification DSL, called Dicto, and a tool coordination framework, called Probo. The approach has been implemented in a prototype that is soon to be evaluated.
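A hedged sketch of rule-based conformance checking in this spirit (the rule and the extracted facts are invented for illustration and do not use Dicto's actual syntax):

```python
# Hypothetical architectural rule: the UI layer must not depend on the DB
# layer. The rule is checked against dependency facts that some off-the-
# shelf analysis tool is assumed to have extracted from the codebase.

FORBIDDEN = ("ui", "db")

dependencies = [("ui", "service"), ("service", "db"), ("ui", "db")]

violations = [d for d in dependencies if d == FORBIDDEN]
print(violations)  # [('ui', 'db')] -> the architectural invariant is violated
```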
Abstract:
Cover title.
Abstract:
"This is our report of the Management Audit of the Department of Central Management Services' Administration of the State's Space Utilization Program. The audit was conducted pursuant to Legislative Audit Commission Resolution Number 126, which was adopted December 11, 2002. This audit was conducted in accordance with generally accepted government auditing standards and the audit standards promulgated by the Office of the Auditor General at 74 Ill. Adm. Code-420.310. The audit report is transmitted in conformance with Section 3-14 of the Illinois State Auditing Act."--Cover letter.
Abstract:
"The audit was conducted pursuant to Legislative Audit Commission Resolution Number 125, which was adopted December 11, 2002. This audit was conducted in accordance with generally accepted government auditing standards and the audit standards promulgated by the Office of the Auditor General at 74 Ill. Adm. Code 420.310. This audit report is transmitted in conformance with Section 3-14 of the Illinois State Auditing Act."--Cover letter.