960 results for Vein Extension
Abstract:
Over the last few decades, the ever-increasing output of scientific publications has led to new challenges in keeping up to date with the literature. In the biomedical area, this growth has introduced new requirements for professionals, e.g., physicians, who have to locate the exact papers that they need for their clinical and research work among a huge number of publications. Against this backdrop, novel information retrieval methods are ever more necessary. While web search engines are widespread in many areas, facilitating access to all kinds of information, additional tools are required to automatically link information retrieved from these engines to specific biomedical applications. In the case of clinical environments, this also means considering aspects such as patient data security and confidentiality or structured contents, e.g., electronic health records (EHRs). In this scenario, we have developed a new tool to facilitate query building to retrieve scientific literature related to EHRs. Results: We have developed CDAPubMed, an open-source web browser extension to integrate EHR features into biomedical literature retrieval approaches. Clinical users can use CDAPubMed to: (i) load patient clinical documents, i.e., EHRs based on the Health Level 7 Clinical Document Architecture standard (HL7-CDA), (ii) identify terms in these documents that are relevant for scientific literature search, i.e., Medical Subject Headings (MeSH), automatically driven by the CDAPubMed configuration, which advanced users can optimize to adapt to each specific situation, and (iii) generate and launch literature search queries to a major search engine, i.e., PubMed, to retrieve citations related to the EHR under examination. Conclusions: CDAPubMed is a platform-independent tool designed to facilitate literature searching using keywords contained in specific EHRs. CDAPubMed is visually integrated, as an extension of a widespread web browser, within the standard PubMed interface. It has been tested on a public dataset of HL7-CDA documents, returning significantly fewer citations because queries are focused on characteristics identified within the EHR. For instance, compared with more than 200,000 citations retrieved for the term "breast neoplasm" alone, fewer than ten citations were retrieved when ten patient features were added using CDAPubMed. This is an open-source tool that can be freely used for non-profit purposes and integrated with other existing systems.
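As a rough illustration of the workflow described above (this is not CDAPubMed's actual source code; the file name and term whitelist below are hypothetical), the basic chain of parsing an HL7-CDA document, keeping coded entries that map to MeSH headings, and assembling a PubMed E-utilities query can be sketched in a few lines of Python:

```python
# Minimal sketch, not CDAPubMed's implementation: pull coded terms from an
# HL7-CDA document and turn them into a PubMed E-utilities search URL.
import urllib.parse
import xml.etree.ElementTree as ET

CDA = "{urn:hl7-org:v3}"                              # HL7 v3 namespace used by CDA
MESH_WHITELIST = {"Breast Neoplasms", "Hypertension"}  # hypothetical MeSH term map

def extract_terms(cda_path):
    """Collect displayName attributes of coded entries that match known MeSH headings."""
    root = ET.parse(cda_path).getroot()
    names = {code.get("displayName") for code in root.iter(CDA + "code")}
    return sorted(name for name in names if name in MESH_WHITELIST)

def pubmed_query_url(terms):
    """Build an ESearch URL that ANDs the terms as MeSH headings."""
    query = " AND ".join(f'"{t}"[MeSH Terms]' for t in terms)
    return ("https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi?db=pubmed&term="
            + urllib.parse.quote(query))

print(pubmed_query_url(extract_terms("patient_cda.xml")))  # hypothetical input file
```

Narrowing the query with several patient-specific headings in this way is what produces the much shorter citation lists reported in the abstract.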
Abstract:
Various researchers have developed models of conventional H2O–LiBr absorption machines with the aim of predicting their performance. In this paper, the methodology of characteristic equations developed by Hellmann et al. (1998) is applied. This model is able to represent the capacity of single-effect absorption chillers and heat pumps by means of simple algebraic equations. An extended characteristic equation based on a characteristic temperature difference has been obtained, taking the features of the experimental facility into account. As a result, it is concluded that for adiabatic absorbers a subcooling temperature must be specified. The effect of evaporator overflow has also been characterized, and its influence on cooling capacity has been included in the extended characteristic equation. Taking these particular design and operation features into account, good agreement between experimental performance data and values obtained through the extended characteristic equation has been achieved at off-design operation, which allows its use for simulation and control purposes.
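For orientation, the characteristic equation method represents chiller capacity as a linear function of a characteristic temperature difference built from the external circuit temperatures. A sketch of its usual form is given below; the symbols are chosen here for illustration and are not taken from the paper:

\[
\Delta\Delta t = \bar{t}_G - a\,\bar{t}_{AC} + e\,\bar{t}_E,
\qquad
\dot{Q}_E \approx s_E\,\Delta\Delta t + r_E,
\]

where \(\bar{t}_G\), \(\bar{t}_{AC}\) and \(\bar{t}_E\) are mean external temperatures at the generator, absorber/condenser and evaporator, and \(a\), \(e\), \(s_E\), \(r_E\) are fitted constants. The extended equation of the paper adds facility-specific effects, such as absorber subcooling and evaporator overflow, on top of this baseline form.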
Abstract:
We propose a fuzzy approach to deal with risk analysis for information systems. We extend the MAGERIT methodology, which valuates asset dependencies, to a fuzzy framework, adding fuzzy linguistic terms to valuate the different elements involved in risk analysis (terminal asset values, asset dependencies, the probability of threats and the resulting asset degradation). Computations are based on the trapezoidal fuzzy numbers associated with these linguistic terms and, finally, the results of these operations are translated back into a linguistic term by means of a similarity function.
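As a rough illustration of this kind of computation (the linguistic scale, arithmetic operator and similarity function below are hypothetical, not the ones defined in the paper), trapezoidal fuzzy numbers can be combined arithmetically and the result mapped back to the closest linguistic term:

```python
# Sketch only: trapezoidal fuzzy numbers (a, b, c, d) for a made-up linguistic
# scale, combined by component-wise arithmetic and mapped back to the closest
# label with a simple similarity function. The paper's operators may differ.
SCALE = {                      # hypothetical 5-term linguistic scale on [0, 1]
    "very low":  (0.0, 0.0, 0.1, 0.2),
    "low":       (0.1, 0.2, 0.3, 0.4),
    "medium":    (0.3, 0.4, 0.6, 0.7),
    "high":      (0.6, 0.7, 0.8, 0.9),
    "very high": (0.8, 0.9, 1.0, 1.0),
}

def mul(x, y):
    """Approximate product of two trapezoidal fuzzy numbers (component-wise)."""
    return tuple(xi * yi for xi, yi in zip(x, y))

def similarity(x, y):
    """1 minus the mean absolute distance between the four defining points."""
    return 1.0 - sum(abs(xi - yi) for xi, yi in zip(x, y)) / 4.0

def to_label(x):
    """Translate a fuzzy result back into the closest linguistic term."""
    return max(SCALE, key=lambda term: similarity(x, SCALE[term]))

# e.g. risk ~ threat probability combined with asset degradation
risk = mul(SCALE["high"], SCALE["medium"])
print(to_label(risk))   # closest linguistic term for the combined value
```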
Abstract:
The spatial distribution of metal grades analysed on drill cores obtained during the exploration campaigns of the Pallancata Vein is investigated. Factor analysis is applied to this distribution and to the ratios between metal values, discriminating those that are correlated with the silver mineralisation and that serve, through their variation gradients, as exploration guides to locate zones of potential reserves. The metal distribution in a vein may show the paths of hydrothermal fluid flow at the time of mineralization. Such information may assist in-fill drilling. The Pallancata Vein has been intersected by 52 drill holes, whose cores were sampled and analysed, and the results plotted to examine the mineralisation trends. The spatial distribution of the ore is observed from the logAg/logPb ratio distribution. Au is in this case closely related to Ag (electrum and uytenbogaardtite, Ag3AuS2). The Au grade shows the same spatial distribution as the Ag grade. The logAg/logPb ratio distribution also suggests possible ore to be expected at deeper locations. Shallow supergene Ag enrichment was also observed.
Abstract:
Sign.: A17
Abstract:
Contains: Villancicos that were sung on 20 May 1696 in the ...
Abstract:
In an early paper, Herbert Mohring (J. Polit. Econ., 69 (1961)) presented a model for land rent distribution yielding the well-known result that the price of land must fall with the distance from the city center to offset transportation costs. Our paper is an extension of Mohring's model in which we relax some of his drastic simplifying assumptions. This extended model has been incorporated into a method for the economic evaluation of city master plans, which has been applied to a Swedish city. In this method the interdependence among housing, heating, and transportation, the durability of urban structures, and the uncertainty of future demand are explicitly considered within a cost-benefit approach. Some empirical results from this pilot study concerning land rent distributions are also presented here.
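In its simplest textbook form (a sketch of the standard rent-gradient argument, not Mohring's full model or its extension), spatial equilibrium requires that rent plus transport cost be constant across locations:

\[
R(x) + T(x) = \text{const.} \quad\Longrightarrow\quad \frac{dR}{dx} = -\frac{dT}{dx} < 0,
\]

so land rent \(R(x)\) declines with distance \(x\) from the city center at exactly the rate at which transport costs \(T(x)\) increase.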
Abstract:
Reciprocal frame structures, formed by a set of mutually supporting elements arranged in a closed circuit, have been used since antiquity to cover large spans with small elements. The roof structure of the Euskalduna conference centre and concert hall extension in Bilbao, covering an irregular geometry of 3000 m² with a maximum span of 45 m, presented an interesting opportunity to revisit the concept and to apply these classical systems. Furthermore, its analysis and development led to an interesting discussion on reciprocal frames. It showed the great sensitivity of these systems to the local modification of a particular element, the establishment of irregular load paths, the mobilisation of almost the entire system when a point load is applied locally and, finally, their large deformability. In addition, reciprocal frames present particular construction complexities and possibilities due to the moderate length of the structural elements, the predominance of shear-only connections and the need for the entire system to be completely erected before its stability is guaranteed. The Euskalduna extension, completed in 2012, is one of the largest irregular reciprocal frame structures built in the world, and a very particular case. It shows the formal possibilities and the potential of reciprocal frames to respond to free and irregular geometries.
Abstract:
Woodcut-engraved initial
Abstract:
Linked Data assets (RDF triples, graphs, datasets, mappings...) can be protected by intellectual property law or database law, or their access or publication can be restricted for other legal reasons (personal data protection, security, etc.). Publishing a rights expression along with the digital asset allows the rightsholder to waive some or all of the IP and database rights (leaving the work in the public domain), to permit some operations if certain conditions are satisfied (such as giving attribution to the author), or simply to remind the audience that some rights are reserved.
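As a minimal sketch of the practice described above (the dataset URI is hypothetical, and the snippet assumes the rdflib library is available), a rights expression can be published as an RDF triple pointing the asset at a licence, here via dct:license:

```python
# Sketch: attaching a rights expression to a Linked Data asset with rdflib,
# using a dct:license triple that points at a CC BY 4.0 licence.
from rdflib import Graph, URIRef
from rdflib.namespace import DCTERMS

g = Graph()
g.bind("dcterms", DCTERMS)

dataset = URIRef("http://example.org/dataset/my-rdf-dump")       # hypothetical asset
cc_by = URIRef("https://creativecommons.org/licenses/by/4.0/")   # waives some rights, requires attribution

g.add((dataset, DCTERMS.license, cc_by))
print(g.serialize(format="turtle"))   # rights expression to publish next to the asset
```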
Abstract:
To correctly evaluate semantic technologies and to obtain results that can be easily integrated, we need to put evaluations under the scope of a single software quality model. This paper presents SemQuaRE, a quality model for semantic technologies. SemQuaRE is based on the SQuaRE standard and describes a set of quality characteristics specific to semantic technologies, together with the quality measures that can be used to quantify them. It also provides detailed formulas for the calculation of such measures. The paper shows that SemQuaRE is complete with respect to current evaluation trends and that it has been successfully applied in practice.
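To make the idea of a "measure with a formula" concrete, a SQuaRE-style derived measure typically combines base measures into a ratio. For instance (an illustrative example only, not one of SemQuaRE's actual measures), an interchange-coverage measure could be defined as

\[
X = \frac{A}{B}, \qquad 0 \le X \le 1,
\]

where \(A\) is the number of test documents interchanged without information loss and \(B\) is the total number of test documents.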
Abstract:
Computed tomography (CT) is the reference imaging modality for the study of lung diseases and of the pulmonary vasculature. General lung vessel segmentation has been addressed in depth over the last years by the image processing community; however, the differentiation between the arterial and venous irrigations is still an open problem. In fact, the automatic separation of arteries and veins is considered one of the major future challenges of biomedical image processing. Artery-vein (AV) segmentation would allow the study of both irrigations separately, with important consequences in different medical scenarios and in multiple pulmonary diseases or pathological states. Features such as the density, geometry, topology and size of the blood vessels could be analysed in diseases that involve remodelling of the pulmonary vasculature, even making possible the discovery of new specific biomarkers that remain hidden today. The differentiation between arteries and veins could also help to improve and develop methods for processing the different pulmonary structures. Nevertheless, despite its undoubted usefulness, studying the effect of disease on the arterial and venous trees has been unfeasible until now. The extreme complexity of the pulmonary vascular trees makes a manual separation of both structures intractable within a realistic time, further motivating the design of automatic or semi-automatic tools for this task. However, the absence of correctly segmented and labelled cases imposes multiple limitations on the development of AV separation systems, which require reference images both to train and to validate the algorithms. For this reason, the design of synthetic lung CT images could overcome these difficulties by providing a database of pseudo-realistic cases in a constrained and controlled environment in which every part of the image (including arteries and veins) is unequivocally differentiated. In this Ph.D. thesis we address both problems, which are strongly interrelated. First, we describe the design of a strategy to automatically generate computational CT phantoms of the human lung. Starting from a priori knowledge, both biological and CT-image based, about the topology of and the relationships between the different pulmonary structures, the system is able to generate synthetic airways and pulmonary arteries and veins using iterative growth methods, which are then merged into a simulated lung with realistic characteristics. These synthetic cases, together with real non-contrast CT images, have been used to develop a fully automatic AV segmentation/separation method. The approach comprises a first generic extraction of the pulmonary vessels using scale-space particles, followed by an AV classification of those particles with Graph-Cuts (GC), based on artery/vein similarity scores obtained with machine learning algorithms and on connectivity information between particles.
The lung phantoms have been validated through visual inspection and quantitative measurements related to the intensity distributions, the dispersion of structures and the relationship between arteries and airways, which show a good correspondence between real and synthetically generated lungs. The evaluation of the AV segmentation algorithm is based on different strategies for assessing the accuracy of the vessel classification, which reveal an adequate differentiation between arteries and veins in both real and synthetic cases, opening a wide range of possibilities for the clinical study of cardiopulmonary diseases and for the development of methodologies and new algorithms for the analysis of pulmonary images.
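In generic form (this is the standard binary Graph-Cuts formulation, not necessarily the exact energy defined in the thesis), the final labelling step can be viewed as minimising, over labels \(l_p \in \{\text{artery}, \text{vein}\}\) assigned to each vessel particle \(p\), an energy of the form

\[
E(l) = \sum_{p} D_p(l_p) \;+\; \lambda \sum_{(p,q)\in\mathcal{N}} w_{pq}\,[\,l_p \neq l_q\,],
\]

where the unary term \(D_p\) is derived from the machine-learning artery/vein similarity score of particle \(p\), the neighbourhood \(\mathcal{N}\) links connected particles, and the pairwise weight \(w_{pq}\) discourages assigning different labels to strongly connected particles.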
Abstract:
This is LUCER-MC Report #03-11, published by the Lincoln University Cooperative Extension and Research (LUCER) Media Center; 900 Chestnut Street, 301 Allen Hall; Jefferson City, MO 65101.
Abstract:
This is LUCER-MC Report #04-12, published by the Lincoln University Cooperative Extension and Research (LUCER) Media Center; 900 Chestnut Street, 301 Allen Hall; Jefferson City, MO 65101.
Abstract:
Volume 1, Issue 3