983 results for automated knowledge visualization


Relevance: 20.00%

Publisher:

Abstract:

Undergraduate medical education is moving from traditional discipline-based basic science courses toward more integrated curricula. Integration models based on organ systems originated in the 1950s, but few longitudinal studies have evaluated their effectiveness. This article outlines the development and implementation of the Organic and Functional Systems (OFS) courses at the University of Minho in Portugal, drawing on evidence collected over 10 years. It describes the organization of content, student academic performance, the acceptability of the courses, the evaluation of preparedness for subsequent courses, and the retention of basic science knowledge. Students consistently rated the OFS courses highly. Physician tutors in subsequent clinical attachments considered the students appropriately prepared. In the International Foundations of Medicine examination, a self-selected sample of students performed similarly on basic science items after the last OFS course and 4 years later, at graduation. In conclusion, the organizational and pedagogical approaches of the OFS courses achieve high acceptability among students and yield positive outcomes in preparedness for subsequent training and in long-term retention of basic science knowledge.

Relevance: 20.00%

Publisher:

Abstract:

This paper presents an automated optimization framework that provides network administrators with resilient routing configurations for link-state protocols, such as OSPF or IS-IS. To deal with the resulting NP-hard optimization problems, the framework is underpinned by computational intelligence optimization engines, such as Multi-objective Evolutionary Algorithms (MOEAs). To demonstrate the framework's capabilities, two illustrative Traffic Engineering methods are described, yielding routing configurations that are robust to changes in traffic demands and that keep the network stable even in the presence of link failure events. The illustrative results corroborate the usefulness of the proposed automated framework and the devised optimization methods.
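The weight-tuning idea behind such a framework can be sketched as a single-objective (1+1) evolutionary search over OSPF-style link weights, routing each demand on its shortest path and minimizing the worst link utilization. Everything below (the toy topology, demands, capacity, and function names) is a hypothetical illustration, not the paper's framework, which uses full MOEAs and failure-aware objectives:

```python
import heapq
import random

def shortest_path(graph, weights, src, dst):
    """Dijkstra over administrative link weights (as in OSPF/IS-IS)."""
    dist, prev, pq = {src: 0}, {}, [(0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == dst:
            break
        if d > dist.get(u, float("inf")):
            continue
        for v in graph[u]:
            nd = d + weights[tuple(sorted((u, v)))]
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))
    path, node = [dst], dst
    while node != src:
        node = prev[node]
        path.append(node)
    return path[::-1]

def max_utilization(graph, weights, demands, capacity):
    """Route every demand on its shortest path; return the worst link utilization."""
    load = {e: 0.0 for e in weights}
    for (s, t), volume in demands.items():
        path = shortest_path(graph, weights, s, t)
        for u, v in zip(path, path[1:]):
            load[tuple(sorted((u, v)))] += volume
    return max(l / capacity for l in load.values())

def evolve_weights(graph, edges, demands, capacity, generations=300, seed=1):
    """(1+1) evolutionary search: mutate one link weight at a time and keep the
    change whenever the worst-case utilization does not get worse."""
    rng = random.Random(seed)
    weights = {e: 1 for e in edges}
    best = max_utilization(graph, weights, demands, capacity)
    for _ in range(generations):
        cand = dict(weights)
        cand[rng.choice(edges)] = rng.randint(1, 20)
        fit = max_utilization(graph, cand, demands, capacity)
        if fit <= best:
            weights, best = cand, fit
    return weights, best

# Hypothetical 4-node topology with one diagonal link and two traffic demands.
graph = {"A": ["B", "C", "D"], "B": ["A", "C"], "C": ["A", "B", "D"], "D": ["A", "C"]}
edges = [("A", "B"), ("A", "C"), ("A", "D"), ("B", "C"), ("C", "D")]
demands = {("A", "C"): 5.0, ("B", "D"): 5.0}
weights, best = evolve_weights(graph, edges, demands, capacity=10.0)
```

On this toy instance the best achievable worst-case utilization is 0.5, which the search retains; a multi-objective variant would additionally score each candidate weight setting under link-failure scenarios.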

Relevance: 20.00%

Publisher:

Abstract:

Genome-scale metabolic models are valuable tools in the metabolic engineering process, owing to their ability to integrate diverse sources of data to produce global predictions of organism behavior. At the most basic level, these models require only a genome sequence to construct, and once built, they may be used to predict essential genes, culture conditions, pathway utilization, and the modifications required to enhance a desired organism behavior. In this chapter, we address two key challenges associated with the reconstruction of metabolic models: (a) leveraging existing knowledge of microbiology, biochemistry, and available omics data to produce the best possible model; and (b) applying available tools and data to automate the reconstruction process. We consider these challenges as we progress through the model reconstruction process, beginning with genome assembly and culminating in the integration of constraints to capture the impact of transcriptional regulation. We divide the reconstruction process into ten distinct steps: (1) genome assembly from sequenced reads; (2) automated structural and functional annotation; (3) phylogenetic tree-based curation of genome annotations; (4) assembly and standardization of a biochemistry database; (5) genome-scale metabolic reconstruction; (6) generation of a core metabolic model; (7) generation of a biomass composition reaction; (8) completion of the draft metabolic model; (9) curation of the metabolic model; and (10) integration of regulatory constraints. Each of these ten steps is documented in detail.
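As one concrete illustration of the curation work in steps (8) and (9), draft models are commonly checked for dead-end metabolites, i.e., species that are only ever produced or only ever consumed, which flag missing reactions to gap-fill. A minimal sketch, assuming a hypothetical toy encoding of reactions as (substrates, products, reversibility):

```python
def dead_end_metabolites(reactions):
    """Return metabolites that are only produced or only consumed.
    reactions: name -> (substrates, products, reversible)."""
    produced, consumed = set(), set()
    for substrates, products, reversible in reactions.values():
        consumed.update(substrates)
        produced.update(products)
        if reversible:                 # a reversible reaction works both ways
            consumed.update(products)
            produced.update(substrates)
    return (produced - consumed) | (consumed - produced)

# Hypothetical draft fragment: glucose uptake plus the first glycolytic steps.
draft = {
    "EX_glc": ([], ["glc"], False),                     # glucose exchange
    "HEX1":   (["glc", "atp"], ["g6p", "adp"], False),  # hexokinase
    "PGI":    (["g6p"], ["f6p"], True),                 # phosphoglucose isomerase
}
gaps = dead_end_metabolites(draft)   # atp is never produced, adp never consumed
```

Here the check reports that atp and adp are dead ends, signalling that an energy-regenerating reaction is missing from the draft and should be added during gap-filling.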

Relevance: 20.00%

Publisher:

Abstract:

OBJECTIVE - The aim of our study was to assess the performance of a wrist monitor, the Omron Model HEM-608, compared with the indirect method for blood pressure measurement. METHODS - Our study population consisted of 100 subjects, 29 normotensive and 71 hypertensive. Participants had their blood pressure checked 8 times with alternating techniques, 4 by the indirect method and 4 with the Omron wrist monitor. The validation criteria used to test this device were based on internationally recognized protocols. RESULTS - Our data showed that the Omron HEM-608 reached a grade B classification for systolic and grade A for diastolic blood pressure according to one of these protocols. The mean differences between blood pressure values obtained with the two methods were -2.3 ± 7.9 mmHg for systolic and 0.97 ± 5.5 mmHg for diastolic blood pressure. Therefore, we considered this type of device approved according to the selected criteria. CONCLUSION - Our study leads us to conclude that this wrist monitor is not only easy to use, but also produces results very similar to those obtained by the standard indirect method.
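One of the internationally recognized criteria used in such validations, the AAMI limit on the device-minus-reference differences, reduces to a two-line check. The 5 mmHg mean and 8 mmHg SD limits below are the commonly cited AAMI thresholds, stated here as an assumption since the abstract does not spell out which protocol the reported figures were tested against:

```python
def aami_pass(mean_diff, sd_diff, mean_limit=5.0, sd_limit=8.0):
    """AAMI-style criterion: the device-minus-reference differences must have
    |mean| <= 5 mmHg and standard deviation <= 8 mmHg."""
    return abs(mean_diff) <= mean_limit and sd_diff <= sd_limit

systolic_ok = aami_pass(-2.3, 7.9)    # reported systolic differences
diastolic_ok = aami_pass(0.97, 5.5)   # reported diastolic differences
```

Both reported mean ± SD pairs fall inside these limits, consistent with the abstract's conclusion that the device was approved.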

Relevance: 20.00%

Publisher:

Abstract:

OBJECTIVE: To assess the Dixtal DX2710 automated oscillometric blood pressure device according to the protocols of the BHS and the AAMI. METHODS: Three blood pressure measurements were taken in 94 patients (53 females; ages 15 to 80 years). The measurements were taken in random order by 2 observers trained to measure blood pressure with a mercury column device connected to the automated device. The device was classified according to the BHS and AAMI protocols. RESULTS: The mean blood pressure obtained by the observers was 148±38/93±25 mmHg, and that obtained with the device was 148±37/89±26 mmHg. Considering the differences between the observers' measurements and those of the automated device according to the BHS criteria, the device was graded "A" for systolic pressure (69% of differences < 5; 90% < 10; and 97% < 15 mmHg) and "B" for diastolic pressure (63% of differences < 5; 83% < 10; and 93% < 15 mmHg). The mean and standard deviation of the differences were 0±6.27 mmHg for systolic pressure and 3.82±6.21 mmHg for diastolic pressure. CONCLUSION: The Dixtal DX2710 device was approved according to the international recommendations.
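The BHS grading used here assigns a letter from the cumulative percentages of differences within 5, 10, and 15 mmHg. A minimal sketch, assuming the commonly cited BHS cut-offs (60/85/95 for A, 50/75/90 for B, 40/65/85 for C), which reproduce the grades reported in the abstract:

```python
def bhs_grade(pct_within_5, pct_within_10, pct_within_15):
    """Map cumulative agreement percentages to a BHS letter grade.
    All three thresholds of a row must be met to earn that grade."""
    cutoffs = [("A", (60, 85, 95)), ("B", (50, 75, 90)), ("C", (40, 65, 85))]
    for grade, (t5, t10, t15) in cutoffs:
        if pct_within_5 >= t5 and pct_within_10 >= t10 and pct_within_15 >= t15:
            return grade
    return "D"

systolic_grade = bhs_grade(69, 90, 97)    # "A", as reported
diastolic_grade = bhs_grade(63, 83, 93)   # "B", as reported
```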

Relevance: 20.00%

Publisher:

Abstract:

We propose to develop and integrate studies on Modeling and Problem Solving in Physics that take as explanatory factors the characteristics of the problem posed, the knowledge of the person solving it, and the process deployed during resolution. We aim to understand how students access prior knowledge, what procedures they use to retrieve some pieces of knowledge and discard others, what criteria give coherence to their decisions, and how those decisions relate to characteristics of the task, among other questions, with a view to studying causal relations between the difficulties encountered and delay or dropout in degree programs. The work is organized along three axes, the first two of theoretical construction and the third of implementation and transfer. We intend to: (1) study the processes of construction of mental representations in physics problem solving, both in experts and in students at different academic levels; (2) analyze and classify the inferences produced during comprehension tasks in physics problem solving, and associate these inferences with processes of transition between mental representations of different kinds; and (3) develop instructional materials and designs for physics teaching, grounded in knowledge of students' psychological requirements in diverse learning tasks. In general terms, we adopt an interpretive approach in light of frameworks from cognitive psychology and the group's own prior developments. We will work with purposive samples of physics students and teachers. Verbal protocols and written records produced during task execution will be used to identify indicators of comprehension, inferences, and different levels of representation. We also plan to analyze written material in common circulation, whether commercial or prepared by the teachers of the degree programs involved. The nature of the object of study, and the differing stages of development of the specific objectives, call for an approach that combines, following Juni and Urbano (2006), both qualitative and quantitative logics.

Relevance: 20.00%

Publisher:

Abstract:

Identification and characterization of the problem. One of the most important problems associated with building software is its correctness. Seeking to provide guarantees that software works correctly, a variety of development techniques with solid mathematical and logical foundations, known as formal methods, have emerged. By their nature, applying formal methods requires considerable experience and expertise, above all in mathematics and logic, which makes their application costly in practice. As a result, their use has been largely confined to critical systems, that is, systems whose malfunction can cause severe damage, even though the benefits of these techniques are relevant to all kinds of software. Carrying the benefits of formal methods into development contexts broader than critical systems would have a high impact on productivity in those contexts. Hypothesis. Automated analysis tools are of great importance here. Indeed, some of the relatively recent successes of formal methods are accompanied by good-quality tools that automate powerful analysis mechanisms, several of which target source code directly and are even integrated into widely used development environments. In the vast majority of these tools, however, the gap between the notions developers are accustomed to and those required to apply formal analysis remains too wide. Many tools use assertion languages outside developers' usual knowledge and habits, and in many cases interpreting the tool's output requires some command of the underlying formal method. This problem can be alleviated by producing suitable tools. Another problem intrinsic to automated analysis techniques is how they behave as the size and complexity of the artifacts under analysis grow (scalability). This limitation is widely known and is considered critical to the practical applicability of formal analysis methods. One way to attack it is to exploit information and characteristics of specific application domains. Objectives. This project aims to build formal analysis tools that contribute to the quality, in terms of functional correctness, of specifications, models, or code in the context of software development. More precisely, we seek to identify specific modelling, design, specification, or coding activities in which certain automated analysis techniques, such as analysis based on SMT or SAT solving, or model checking, can be taken to levels of scalability beyond those known for these techniques in general settings. We will attempt to implement the adaptations of the chosen techniques in tools usable by developers familiar with the application context, but not necessarily versed in the underlying methods or techniques. Materials and methods. The materials will be literature relevant to the area and computing equipment. The methods will be those of discrete mathematics, logic, and software engineering. Expected results. One expected result of the project is the identification of specific domains for the application of formal analysis methods. We also expect the project to yield analysis tools whose usability is adequate for developers without specific training in the formal methods employed. Importance of the project. The main impact of this project will be its contribution to the practical application of formal analysis techniques at different stages of software development, with the aim of increasing software quality and reliability.
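To make the SAT-solving side of this concrete, the core decision procedure behind SAT-based analysis can be sketched as a plain DPLL loop (unit propagation plus branching). This is a hypothetical minimal illustration, not the tooling the project proposes; clauses are lists of DIMACS-style integer literals, where a positive integer is a variable and a negative one its negation:

```python
def dpll(clauses, assignment=None):
    """Minimal DPLL SAT solver: returns a satisfying {var: bool} assignment, or None."""
    if assignment is None:
        assignment = {}
    clauses = [list(c) for c in clauses]
    changed = True
    while changed:                      # unit propagation to a fixed point
        changed = False
        units = [c[0] for c in clauses if len(c) == 1]
        for lit in units:
            assignment[abs(lit)] = lit > 0
            reduced_clauses = []
            for c in clauses:
                if lit in c:
                    continue            # clause satisfied, drop it
                reduced = [l for l in c if l != -lit]
                if not reduced:
                    return None         # empty clause: conflict
                reduced_clauses.append(reduced)
            clauses = reduced_clauses
            changed = True
    if not clauses:
        return assignment               # every clause satisfied
    var = abs(clauses[0][0])            # branch on the first open variable
    for choice in (var, -var):
        result = dpll(clauses + [[choice]], dict(assignment))
        if result is not None:
            return result
    return None

# (x1 or x2) and (not x1 or x3) and (not x3)
model = dpll([[1, 2], [-1, 3], [-3]])
```

Industrial solvers add clause learning, heuristics, and restarts on top of this skeleton; the scalability gains the project targets come precisely from exploiting domain structure before handing problems to such engines.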

Relevance: 20.00%

Publisher:

Abstract:

Among the factors that help predict academic achievement are those reflecting cognitive capacities (e.g., intelligence) and those individual differences considered non-cognitive (e.g., personality traits). In recent years, General Knowledge (GK) has also been considered a criterion for academic success (see Ackerman, 1997), since prior knowledge has been shown to aid the acquisition of new knowledge (Hambrick & Engle, 2001). One of the goals of educational psychology is to identify the main variables that explain academic achievement, as well as to propose theoretical models that explain the relations among these variables. The PPIK model (Intelligence-as-Process, Personality, Interests, and Intelligence-as-Knowledge) proposed by Ackerman (1996) holds that the knowledge and skills acquired in a particular domain are the result of the cognitive resources a person devotes over a prolonged period of time. The model proposes that personality traits, individual/vocational interests, and motivational aspects are integrated as trait complexes that determine the direction and intensity of the cognitive resources a person devotes to learning (Ackerman, 2003). In our setting (Córdoba, Argentina), a group of researchers has developed a series of the technical resources needed to assess some of the constructs proposed by this model. However, we do not yet have a measure of General Knowledge. The present project therefore proposes the construction of an instrument to measure General Knowledge (GK), an indispensable tool for establishing parameters on the knowledge level of the university population and, in future work, for testing the postulates of PPIK theory (Ackerman, 1996).

Relevance: 20.00%

Publisher:

Abstract:

Somatic post-surgical pain is invalidating and distressing to patients and carries the risk of important complications. The anterior abdominal wall is involved in most surgical procedures in general, gynecologic, obstetric, urological, vascular, and pediatric surgery. Combined multimodal strategies involving nerve blocks, opiates, and non-steroidal anti-inflammatory drugs for systemic analgesia are necessary for optimal pain modulation. Anterior abdominal wall blocks (the transversus abdominis plane block, iliohypogastric and ilioinguinal nerve blocks, genitofemoral nerve block, and rectus sheath block) have an important role as components of multimodal analgesia for somatic intraoperative and postoperative pain control. Ultrasound visualization has improved the efficacy and safety of abdominal blocks and facilitated their application in the clinical setting. For this reason, they are a very important tool for all anesthesiologists who aim to treat patients' pain effectively. This guide provides an evidence-based, comprehensive overview of the anatomical, anesthesiological, and technical information needed to perform these blocks safely.

Relevance: 20.00%

Publisher:

Abstract:

Magdeburg, Univ., Fak. für Naturwiss., Diss., 2011

Relevance: 20.00%

Publisher:

Abstract:

Background: Despite being recommended as a compulsory part of the school curriculum, the teaching of basic life support (BLS) has yet to be implemented in high schools in most countries. Objectives: To compare prior knowledge and the degree of immediate and delayed learning between students of one public and one private high school after BLS training. Methods: Thirty students from each school initially answered a questionnaire on cardiopulmonary resuscitation (CPR) and use of the automated external defibrillator (AED). They then received theoretical and practical BLS training, after which they were given two theory assessments: one immediately after the course and the other six months later. Results: Overall success rates in the prior, immediate, and delayed assessments differed significantly between groups, with private school students performing better than public school students: 42% ± 14% vs. 30.2% ± 12.2%, p = 0.001; 86% ± 7.8% vs. 62.4% ± 19.6%, p < 0.001; and 65% ± 12.4% vs. 45.6% ± 16%, p < 0.001, respectively. The pooled odds ratios across questions showed that private school students performed better on all three assessments: 1.66 (95% CI 1.26-2.18), p < 0.001; 3.56 (95% CI 2.57-4.93), p < 0.001; and 2.21 (95% CI 1.69-2.89), p < 0.001, respectively. Conclusions: Before training, most students had insufficient knowledge of CPR and AED use; after BLS training, significant immediate and delayed improvements in learning were observed, especially among private school students.
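Odds ratios with 95% confidence intervals of the kind reported above are typically computed from 2x2 tables of correct vs. incorrect answers. A minimal sketch using the standard Wald interval on the log odds ratio; the counts are hypothetical, since the abstract does not give the raw tables:

```python
from math import exp, log, sqrt

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio of a 2x2 table [[a, b], [c, d]] with a Wald 95% CI.
    a, b: correct/incorrect in group 1; c, d: correct/incorrect in group 2."""
    or_ = (a * d) / (b * c)
    se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # standard error of log(OR)
    low = exp(log(or_) - z * se)
    high = exp(log(or_) + z * se)
    return or_, low, high

# Hypothetical counts: group 1 answered 30/10 correct/incorrect, group 2 15/45.
or_, low, high = odds_ratio_ci(30, 10, 15, 45)
```

A pooled estimate across many questions would combine such tables (e.g., with a Mantel-Haenszel estimator) rather than averaging per-question ratios.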

Relevance: 20.00%

Publisher:

Abstract:

Propositionalization, Inductive Logic Programming, Multi-Relational Data Mining

Relevance: 20.00%

Publisher:

Abstract:

Our knowledge of the anatomy and physiology of the cardiovascular system (CVS) has been progressing since the fourth millennium BC. In Egypt (3500 BC), it was believed that a set of channels interconnected with the heart transported air, urine, blood, and the soul. One thousand years later, the heart was established as the center of the CVS by the Hippocratic Corpus of the medical school of Kos, and some of the CVS's anatomical characteristics were defined. The CVS was thought to transport blood via the right ventricle through veins and the pneuma via the left ventricle through arteries. Two hundred years later, in Alexandria, following the development of human anatomical dissection, Herophilus discovered that arteries were 6 times thicker than veins, and Erasistratus described the semilunar valves, emphasizing that arteries were filled with blood when the ventricles were empty. A further 200 years later, Galen demonstrated that arteries contained blood and not air. With the decline of the Roman Empire, Greco-Roman medical knowledge of the CVS was preserved in Persia and later in the Islamic world, where Ibn Nafis accurately described the pulmonary circulation. The resurgence of dissection of the human body in 14th-century Europe was associated with a revival of knowledge of the CVS. The main findings were the description of the pulmonary circulation by Servetus, the anatomical discoveries of Vesalius, the demonstration of the pulmonary circulation by Colombo, and the discovery of venous valves by Fabricius. Following these developments, Harvey described the circulation of the blood.