981 results for Axiomatic formal system


Relevância:

30.00%

Publicador:

Resumo:

Doctoral thesis in Electronic and Computer Engineering

Relevância:

30.00%

Publicador:

Resumo:

Problem identification and characterization. One of the most important problems associated with building software is its correctness. In the search for guarantees of correct software behavior, a variety of development techniques with solid mathematical and logical foundations, known as formal methods, have emerged. By their nature, applying formal methods requires considerable experience and knowledge, above all in mathematics and logic, which makes their application costly in practice. As a result, their main application has been confined to critical systems, that is, systems whose malfunction can cause severe damage, even though the benefits of these techniques are relevant to every kind of software. Carrying the benefits of formal methods into software development contexts broader than critical systems would have a high impact on productivity in those contexts. Hypothesis. Having automated analysis tools is of great importance. Examples include several powerful analysis tools based on formal methods that target source code directly. In the large majority of these tools, the gap between the notions developers are used to and those needed to apply these formal analysis tools remains too wide. Many tools use assertion languages outside developers' usual knowledge and habits. Moreover, in many cases interpreting a tool's output requires some command of the underlying formal method. This problem can be alleviated by building suitable tools. Another problem intrinsic to automated analysis techniques is how they behave as the size and complexity of the artifacts under analysis grow (scalability).
This limitation is widely known and is considered critical for the practical applicability of formal analysis methods. One way to attack the problem is to exploit information and characteristics of specific application domains. Objectives. This project aims to build formal analysis tools that contribute to the quality, in terms of functional correctness, of specifications, models, and code in the context of software development. More precisely, it seeks to identify specific settings in which certain automated analysis techniques, such as analysis based on SMT or SAT solving, or model checking, can be taken to scalability levels beyond those known for these techniques in general settings. We will attempt to implement the adapted techniques in tools usable by developers familiar with the application context but not necessarily versed in the underlying methods or techniques. Materials and methods. The materials will be literature relevant to the area and computing equipment. The methods will be those of discrete mathematics, logic, and software engineering. Expected results. One expected result of the project is the identification of specific domains of application for formal analysis methods. We also expect the project to produce analysis tools whose usability is adequate for developers without specific training in the underlying formal methods. Significance. The main impact of this project will be its contribution to the practical application of formal analysis techniques at different stages of software development, with the goal of increasing software quality and reliability. A crucial factor for software quality is correctness.
Traditionally, formal approaches to software development concentrate on functional correctness and tackle this problem by relying on well-defined notations founded on solid mathematical grounds. Their precise semantics makes formal methods well suited for analysis, but they are usually more complex and require familiarity and experience with the manipulation of mathematical definitions. Their acceptance by software engineers is therefore rather limited, and applications of formal methods have been confined to critical systems. Nevertheless, the advantages that formal methods provide apply to any kind of software system. It is widely accepted that appropriate tool support for formal analysis is essential if software development based on formal methods is to be supported. Indeed, some of the relatively recent successes of formal methods are accompanied by good-quality tools that automate powerful analysis mechanisms and are even integrated into widely used development environments. Still, most of these tools concentrate on code analysis, and in many cases they are still far from being simple enough to be employed by software engineers without experience in formal methods. Another important obstacle to the adoption of tool support for formal methods is scalability. Automated software analysis is intrinsically complex, and its techniques do not scale well in the general case. In this project, we will attempt to identify particular modelling, design, specification, or coding activities in software development processes in which to apply automated formal analysis techniques. By focusing on very specific application domains, we expect to find characteristics that can be exploited to increase the scalability of the corresponding analyses, compared to the general case.
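The project abstract above centers on SAT- and SMT-based automated analysis. As a toy illustration of the kind of question a SAT solver answers, the sketch below brute-forces satisfiability of a small CNF formula; real analysis tools delegate this to industrial solvers, and none of this code comes from the project itself.

```python
from itertools import product

def is_satisfiable(clauses, n_vars):
    """Brute-force SAT check over CNF clauses.

    Clauses use 1-based literals: 3 means x3, -3 means NOT x3.
    Industrial SAT/SMT solvers do this vastly more efficiently;
    exhaustive search is only a toy illustration.
    """
    for assignment in product([False, True], repeat=n_vars):
        def lit_true(lit):
            value = assignment[abs(lit) - 1]
            return value if lit > 0 else not value
        # A CNF formula holds when every clause has a true literal.
        if all(any(lit_true(l) for l in clause) for clause in clauses):
            return assignment
    return None

# (x1 OR x2) AND (NOT x1 OR x3) AND (NOT x2 OR NOT x3) -- satisfiable
model = is_satisfiable([[1, 2], [-1, 3], [-2, -3]], 3)
# (x1) AND (NOT x1) -- unsatisfiable
no_model = is_satisfiable([[1], [-1]], 1)
```

The exponential blow-up of this search is exactly the scalability problem the abstract describes; exploiting domain-specific structure is one way around it.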

Relevância:

30.00%

Publicador:

Resumo:

Magdeburg, University, Faculty of Computer Science, Dissertation, 2011

Relevância:

30.00%

Publicador:

Resumo:

A review article in The New England Journal of Medicine reports that, almost a century ago, Abraham Flexner, a research scholar at the Carnegie Foundation for the Advancement of Teaching, undertook an assessment of medical education in the 155 medical schools then in operation in the United States and Canada. Flexner’s report emphasized the nonscientific approach of American medical schools to preparation for the profession, which contrasted with the university-based system of medical education in Germany. At the core of Flexner’s view was the notion that formal analytic reasoning, the kind of thinking integral to the natural sciences, should hold pride of place in the intellectual training of physicians. This idea was pioneered at Harvard University, the University of Michigan, and the University of Pennsylvania in the 1880s, but was most fully expressed in the educational program at Johns Hopkins University, which Flexner regarded as the ideal for medical education. (...)

Relevância:

30.00%

Publicador:

Resumo:

OBJECTIVE: To assess the change in non-compliant items in prescription orders following the implementation of a computerized physician order entry (CPOE) system named PreDiMed. SETTING: The departments of internal medicine (39 and 38 beds) of two regional hospitals in Canton Vaud, Switzerland. METHOD: The prescription lines in 100 pre- and 100 post-implementation patients' files were classified according to three modes of administration (medicines for oral or other non-parenteral use; medicines administered parenterally or via nasogastric tube; pro re nata (PRN), as needed) and analyzed for a number of relevant variables constitutive of medical prescriptions. MAIN OUTCOME MEASURE: The monitored variables depended on the pharmaceutical category and included mainly the name of the medicine, pharmaceutical form, posology and route of administration, diluting solution, flow rate, and identification of the prescriber. RESULTS: In 2,099 prescription lines, the total number of non-compliant items was 2,265 before CPOE implementation, or 1.079 non-compliant items per line. Two-thirds of these were due to missing information, and the remaining third to incomplete information. In 2,074 prescription lines after CPOE implementation, the number of non-compliant items had decreased to 221, or 0.107 non-compliant items per line, a dramatic 10-fold decrease (χ² = 4615; P < 10⁻⁶). Limitations of the computerized system were the risk of erroneous items in some non-prefilled fields and ambiguity due to a field with doses shown on commercial products. CONCLUSION: The deployment of PreDiMed in two departments of internal medicine has led to a major improvement in the formal aspects of physicians' prescriptions. Some limitations of the first version of PreDiMed were unveiled and are being corrected.
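The per-line rates and the reported 10-fold decrease follow directly from the counts given in the abstract; the short sketch below simply redoes that arithmetic.

```python
# Counts reported in the abstract.
pre_items, pre_lines = 2265, 2099    # before CPOE implementation
post_items, post_lines = 221, 2074   # after CPOE implementation

pre_rate = pre_items / pre_lines     # non-compliant items per line, pre
post_rate = post_items / post_lines  # non-compliant items per line, post
fold_decrease = pre_rate / post_rate # roughly a 10-fold decrease
```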

Relevância:

30.00%

Publicador:

Resumo:

Abstract Since its creation, the Internet has permeated our daily life. The web is omnipresent for communication, research, and organization, and this widespread use has driven the rapid development of the Internet. Nowadays, the Internet is the biggest container of resources: information databases such as Wikipedia, Dmoz, and the open data available on the net are a great informational resource for mankind. Easy and free web access is one of the major features characterizing Internet culture. Ten years ago, the web was completely dominated by English; today, the web community is no longer only English-speaking but is becoming a genuinely multilingual community. The availability of content is intertwined with the availability of logical organizations (ontologies), for which multilinguality plays a fundamental role. In this work we introduce a very high-level logical organization fully based on semiotic assumptions. We present the theoretical foundations as well as the ontology itself, named the Linguistic Meta-Model. The most important feature of the Linguistic Meta-Model is its ability to support the representation of different knowledge sources developed according to different underlying semiotic theories. This is possible because most knowledge representation schemata, either formal or informal, can be put into the context of the so-called semiotic triangle. In order to show the main characteristics of the Linguistic Meta-Model from a practical point of view, we developed VIKI (Virtual Intelligence for Knowledge Induction). VIKI is a work-in-progress system aimed at exploiting the Linguistic Meta-Model structure for knowledge expansion. It is a modular system in which each module accomplishes a natural language processing task, from terminology extraction to knowledge retrieval.
VIKI is a supporting system for the Linguistic Meta-Model, and its main task is to give some empirical evidence regarding the use of the Linguistic Meta-Model, without claiming to be exhaustive.
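As a purely hypothetical illustration of the modular pipeline the abstract describes, with terminology extraction feeding knowledge retrieval, the toy sketch below chains two stand-in modules. None of VIKI's actual components, names, or data appear here.

```python
from collections import Counter

def extract_terms(text, stopwords=frozenset({"the", "a", "of", "is", "and"})):
    """Toy terminology-extraction module: rank content words by frequency."""
    words = [w.strip(".,").lower() for w in text.split()]
    counts = Counter(w for w in words if w not in stopwords)
    return [w for w, _ in counts.most_common()]

def retrieve(term, knowledge_base):
    """Toy knowledge-retrieval module: look the term up in a dictionary."""
    return knowledge_base.get(term, "unknown")

# Hypothetical chaining of the two modules.
kb = {"ontology": "a logical organization of knowledge"}
terms = extract_terms("The ontology is a logical organization. The ontology is shared.")
top = terms[0]
meaning = retrieve(top, kb)
```

A real system would replace both stand-ins with proper NLP components; the point is only the modular hand-off between tasks.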

Relevância:

30.00%

Publicador:

Resumo:

The aim of the present study was to develop titles of Nursing Diagnoses and Outcomes (ND/NO) through the relationship between the terms of the Focus axis, limited to the Circulatory System Process, and the terms of other ICNP® axes, and to integrate these terms into an ontology. Titles were developed by linking 17 terms of the Focus axis, which were evaluated by expert nurses in five Brazilian cities. Titles whose use concordance was above 0.80 were included in the ontology. In total, 89 titles for ND/NO were supported in the literature and 19 were not; 37 were assessed as eligible for use in healthcare practice and were included in the ontology. The construction of ND/NO titles based on the ICNP® and using a formal representation of knowledge is a task that requires a deepening of the concepts used in nursing and adequate revisions of the classification. The elaborated titles will facilitate the composition of diagnoses that are more consistent with practice.
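The inclusion rule described above, keeping only titles whose use concordance among expert nurses exceeds 0.80, can be sketched as a simple filter. The titles and scores below are invented placeholders, not the study's data.

```python
# Hypothetical candidate titles mapped to invented concordance scores.
candidate_titles = {
    "Impaired Tissue Perfusion": 0.92,
    "Risk of Bleeding": 0.85,
    "Altered Vascular Process": 0.60,
}

# Keep only titles whose use concordance is above the 0.80 threshold.
included = {t for t, concordance in candidate_titles.items() if concordance > 0.80}
```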

Relevância:

30.00%

Publicador:

Resumo:

Interfacial hydrodynamic instabilities arise in a range of chemical systems. One mechanism for instability is the occurrence of unstable density gradients due to the accumulation of reaction products. In this paper we conduct two-dimensional nonlinear numerical simulations for a member of this class of system: the methylene blue-glucose reaction. The result of this reaction is the oxidation of glucose to a slightly denser product, gluconic acid, which accumulates at oxygen-permeable interfaces, such as a surface open to the atmosphere. The reaction is catalyzed by methylene blue. We show that simulations help to disentangle the mechanisms responsible for the onset of instability and the evolution of patterns, and we demonstrate that some of the results are remarkably consistent with experiments. We probe the impact of the upper oxygen boundary condition, for fixed-flux, fixed-concentration, or mixed boundary conditions, and find significant qualitative differences in solution behavior; structures either attract or repel one another depending on the boundary condition imposed. We suggest that the form of the boundary condition can be measured by observing oxygen penetration, and that improved product yields may be obtained through proper control of boundary conditions in an engineering setting. We also investigate the dependence on parameters such as the Rayleigh number and depth. Finally, we find that the pseudo-steady linear and weakly nonlinear techniques described elsewhere are useful tools for predicting the behavior of instabilities beyond their formal range of validity, as good agreement is obtained with the simulations.
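The paper's model is two-dimensional and nonlinear; as a heavily simplified sketch of how the upper oxygen boundary condition is imposed numerically, the code below solves one-dimensional diffusion into a layer with either a fixed-concentration (Dirichlet) or fixed-flux (Neumann) surface condition. All parameters are invented and the chemistry is omitted entirely.

```python
def diffuse(bc_top, n=50, steps=2000, D=1.0, dx=1.0, dt=0.2):
    """Explicit finite-difference 1D diffusion of oxygen into a layer.

    bc_top = ("dirichlet", value): fixed surface concentration.
    bc_top = ("neumann", flux):    fixed inward surface flux.
    Stability requires D*dt/dx**2 <= 0.5 (here it is 0.2).
    """
    c = [0.0] * n  # initially no oxygen in the layer
    for _ in range(steps):
        new = c[:]
        for i in range(1, n - 1):
            new[i] = c[i] + D * dt / dx**2 * (c[i+1] - 2*c[i] + c[i-1])
        kind, val = bc_top
        if kind == "dirichlet":
            new[0] = val                    # concentration pinned at surface
        else:
            new[0] = new[1] + val * dx / D  # gradient pinned at surface
        new[-1] = new[-2]                   # no-flux bottom boundary
        c = new
    return c

fixed_conc = diffuse(("dirichlet", 1.0))
fixed_flux = diffuse(("neumann", 0.05))
```

Even this toy shows why the two conditions behave differently: the Dirichlet profile saturates toward the surface value, while the Neumann profile keeps accumulating mass at a fixed rate.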

Relevância:

30.00%

Publicador:

Resumo:

BACKGROUND/RATIONALE: Patient safety is a major concern in healthcare systems worldwide. Although most safety research has been conducted in the inpatient setting, evidence indicates that medical errors and adverse events are a threat to patients in the primary care setting as well. Since information about the frequency and outcomes of safety incidents in primary care is required, the goals of this study are to describe the type, frequency, and seasonal and regional distribution of medication incidents in primary care in Switzerland, and to elucidate possible risk factors for medication incidents. METHODS AND ANALYSIS: STUDY DESIGN AND SETTING: We will conduct a prospective surveillance study to identify cases of medication incidents among primary care patients in Switzerland over the course of the year 2015. PARTICIPANTS: Patients undergoing drug treatment by 167 general practitioners or paediatricians reporting to the Swiss Federal Sentinel Reporting System. INCLUSION CRITERIA: Any erroneous event, as defined by the physician, related to the medication process and interfering with the normal treatment course. EXCLUSION CRITERIA: Lack of treatment effect, adverse drug reactions, or drug-drug or drug-disease interactions without a detectable treatment error. PRIMARY OUTCOME: Medication incidents. RISK FACTORS: Age, gender, polymedication, morbidity, care dependency, hospitalisation. STATISTICAL ANALYSIS: Descriptive statistics to assess the type, frequency, and seasonal and regional distribution of medication incidents, and logistic regression to assess their association with potential risk factors. Estimated sample size: 500 medication incidents. LIMITATIONS: We will take into account under-reporting and selective reporting, among others, as potential sources of bias or imprecision when interpreting the results. ETHICS AND DISSEMINATION: No formal ethics request was necessary because the data are fully anonymised.
The results will be published in a peer-reviewed journal. TRIAL REGISTRATION NUMBER: NCT0229537.
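The planned analysis above pairs descriptive statistics with logistic regression. Purely to illustrate the kind of association estimate such an analysis yields (an odds ratio for a risk factor), the sketch below fits a one-predictor logistic model by plain gradient ascent on invented data; it is not the study's analysis code, and polymedication is used only as an example predictor.

```python
import math

# Invented data: 40 hypothetical patients, binary exposure and outcome.
exposed  = [1] * 20 + [0] * 20                     # polymedication yes/no
incident = [1] * 12 + [0] * 8 + [1] * 4 + [0] * 16 # medication incident yes/no

b0, b1 = 0.0, 0.0  # intercept and exposure coefficient
lr = 0.1
for _ in range(5000):  # plain gradient ascent on the log-likelihood
    g0 = g1 = 0.0
    for x, y in zip(exposed, incident):
        p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))  # predicted risk
        g0 += y - p
        g1 += (y - p) * x
    b0 += lr * g0 / len(exposed)
    b1 += lr * g1 / len(exposed)

odds_ratio = math.exp(b1)  # association between exposure and incidents
```

With one binary predictor the fitted odds ratio reduces to the classic 2x2 odds ratio, which makes the toy easy to sanity-check by hand.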

Relevância:

30.00%

Publicador:

Resumo:

The impact of round-the-clock cerebrospinal fluid (CSF) Gram staining on overnight empirical therapy for suspected central nervous system (CNS) infections was investigated. All consecutive overnight CSF Gram stains between 2006 and 2011 were included. The impact of a positive or a negative test on empirical therapy was evaluated and compared to other clinical and biological indications based on institutional guidelines. Bacterial CNS infection was documented in 51/241 suspected cases. The overnight CSF Gram stain was positive in 24/51. Upon validation, there were two false-positive results and one false-negative result. The sensitivity and specificity were 41% and 99%, respectively. All patients but one had indications for empirical therapy other than the Gram stain alone. Upon obtaining the Gram result, empirical therapy was modified in 7/24 cases, including the addition of an appropriate agent (1), the addition of unnecessary agents (3), and the simplification of unnecessary combination therapy (3/11). Among 74 cases with a negative CSF Gram stain and without a formal indication for empirical therapy, antibiotics were withheld in only 29. Round-the-clock CSF Gram staining had a low impact on overnight empirical therapy for suspected CNS infections and was associated with several misinterpretation errors. Clinicians showed little confidence in CSF direct examination for simplifying or withholding therapy before definite microbiological results.
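The reported 41% sensitivity and 99% specificity follow from the standard 2x2 definitions. The sketch below uses cell counts back-calculated to be consistent with the abstract's figures (51 documented infections among 241 suspected cases, two false positives); the exact 2x2 split is not given in the abstract, so these counts are illustrative only.

```python
def sensitivity(tp, fn):
    """Fraction of true infections the test detects: TP / (TP + FN)."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """Fraction of non-infected cases the test clears: TN / (TN + FP)."""
    return tn / (tn + fp)

# Illustrative counts consistent with the reported percentages:
tp, fn = 21, 30   # 21/51 of documented infections stain-positive (~41%)
tn, fp = 188, 2   # 188/190 of non-infections stain-negative (~99%)
sens = sensitivity(tp, fn)
spec = specificity(tn, fp)
```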

Relevância:

30.00%

Publicador:

Resumo:

The design methods and languages targeted at modern System-on-Chip designs face tremendous pressure from ever-increasing complexity, power, and speed requirements. Estimating any of these three metrics involves a trade-off between accuracy and the level of abstraction at which the system under design is analyzed: the more detailed the description, the more accurate, but also the more time-consuming, the simulation will be. Moreover, a designer wants to make decisions as early as possible in the design flow to avoid costly design backtracking. To answer the challenges posed by System-on-Chip designs, this thesis introduces a formal, power-aware framework, its development methods, and methods to constrain and analyze the power consumption of the system under design. The thesis discusses power analysis of synchronous and asynchronous systems, including the communication aspects of these systems. The presented framework is built upon the Timed Action System formalism, which offers an environment to analyze and constrain the functional and temporal behavior of the system at a high abstraction level. Furthermore, due to the complexity of System-on-Chip designs, the ability to abstract away unnecessary implementation details at higher abstraction levels is an essential part of the introduced design framework. The encapsulation and abstraction techniques, combined with procedure-based communication, allow a designer to use the presented power-aware framework to model these large-scale systems. The introduced techniques also enable the development of communication and computation to be divided into separate tasks, a property that is taken into account in the power analysis as well. Furthermore, the framework is developed so that it can be used throughout a design project.
In other words, a designer is able to model and analyze systems from an abstract specification down to an implementable specification.
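In the spirit of the high-abstraction-level power analysis described above, the hypothetical sketch below annotates actions with a duration and an average power so that a sequential schedule's energy can be estimated before any implementation detail is fixed. The Timed Action System formalism itself is not reproduced here; all names and numbers are invented.

```python
from dataclasses import dataclass

@dataclass
class Action:
    """A timed action annotated with an average power draw (invented model)."""
    name: str
    duration_s: float  # execution time of the action, in seconds
    power_w: float     # average power drawn while the action runs, in watts

    @property
    def energy_j(self):
        # Energy = average power x duration.
        return self.duration_s * self.power_w

def schedule_energy(actions):
    """Total energy of a sequential schedule of timed actions, in joules."""
    return sum(a.energy_j for a in actions)

# Separating computation from communication, each with its own power budget.
compute = Action("compute", duration_s=2.0, power_w=1.5)
communicate = Action("communicate", duration_s=0.5, power_w=4.0)
total = schedule_energy([compute, communicate])  # 3.0 + 2.0 = 5.0 J
```

Refining an abstract action into several concrete ones would simply replace one entry in the schedule, which is the sense in which such a model can follow a design from specification to implementation.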

Relevância:

30.00%

Publicador:

Resumo:

Abstract: In this article we analyze the key concept of Hilbert's axiomatic method, namely that of an axiom. We find two different concepts: the first from the period of Hilbert's foundation of geometry, and the second from the time of the development of his proof theory. Both conceptions are linked to two different notions of intuition and show how far Hilbert's ideas are from a purely formalist conception of mathematics. The principal thesis of this article is that one of the main problems Hilbert encountered in his foundational studies was securing a link between formalization and intuition. We also analyze a related problem, which we call "Frege's Problem", from the time of the foundation of geometry, and investigate the role of the Axiom of Completeness in its solution.

Relevância:

30.00%

Publicador:

Resumo:

Leadership is essential to the effectiveness of teams and of the organizations they are part of. The challenges facing organizations today require an exhaustive review of the strategic role of leadership. In this context, it is necessary to explore new types of leadership capable of providing an effective response to new needs. Present-day situations, characterized by complexity and ambiguity, make it difficult for an external leader to perform all leadership functions successfully. Likewise, knowledge-based work requires providing professional groups with sufficient autonomy to perform leadership functions. This study focuses on shared leadership in the team context. Shared leadership is seen as an emergent team property resulting from the distribution of leadership influence across multiple team members. It entails sharing power and influence broadly among the team members rather than centralizing them in the hands of a single individual acting in the clear role of a leader. By identifying the team itself as a key source of influence, this study points to the relational nature of leadership as a social construct, in which leadership is seen as a social process of relating, co-constructed by several team members. Based on recent theoretical developments in relational, practice-based, and constructionist approaches to the study of leadership processes, this thesis proposes the study of leadership interactions, working processes, and practices, focusing on the construction of direction, alignment, and commitment. During the research process, critical events, activities, working processes, and practices of a case team were examined and analyzed with the grounded theory approach in terms of shared leadership. There are a variety of components to this complex process and a multitude of factors that may influence the development of shared leadership.
The study suggests that the development of shared leadership is a common sense-making process consisting of four overlapping dimensions (individual, social, structural, and developmental) for the team to work with. For shared leadership to emerge, the members of the team must offer leadership services, and the team as a whole must be willing to rely on leadership by multiple team members. For these individual and collective behaviors to occur, the team members must believe that offering influence to and accepting it from fellow team members are welcome and constructive actions. Leadership emerges when people with differing world views use dialogue and collaborative learning to create spaces where a shared common purpose can be achieved while a diversity of perspectives is preserved and valued. This study also suggests that this process can be supported by different kinds of meaning-making and process tools. Leadership, then, does not reside in a person or in a role, but in the social system. The framework built here integrates the different dimensions of shared leadership and describes their relationships. In this way, the findings of this study contribute to the understanding of what constitutes the essential aspects of shared leadership in the team context, and can be of theoretical value in advancing the adoption and development of shared leadership. In the real world, teams and organizations can create conditions to foster and facilitate the process. We should encourage leaders and team members to approach leadership as a collective effort that the team can be prepared for, so that the response is rapid and efficient.