899 results for Medical lab data


Relevance: 30.00%

Abstract:

Ice-rich permafrost landscapes are sensitive to climate and environmental change due to the melt-out of ground ice during thermokarst development. Thermokarst processes in the northern Yukon Territory are currently not well documented. Lake sediments from Herschel Island (69°36'N; 139°04'W) in the western Canadian Arctic provide a record of thermokarst lake development since the early Holocene. A 727 cm long lake sediment core was analyzed for radiographic images, magnetic susceptibility, granulometry, and biogeochemical parameters (organic carbon, nitrogen, and stable carbon isotopes). Based on eight calibrated AMS radiocarbon dates, the sediment record covers the last ~11,500 years and was divided into four lithostratigraphic units (A to D) reflecting different thermokarst stages. Thermokarst initiation in the study area began ~11.5 cal ka BP. From ~11.5 to 10.0 cal ka BP, lake sediments of unit A started to accumulate in an initial lake basin created by the melt-out of massive ground ice and thaw subsidence. Between 10.0 and 7.0 cal ka BP (unit B), the lake basin expanded in size and depth, an expansion attributed to talik formation during the Holocene thermal maximum. Higher-than-modern summer air temperatures led to increased lake productivity and widespread terrain disturbances in the lake's catchment. Thermokarst lake development between 7.0 and 1.8 cal ka BP (unit C) was characterized by a dynamic equilibrium in which the lake basin and talik steadily expanded into the ambient ice-rich terrain through shoreline erosion. Once a lake becomes deeper than the maximum winter lake-ice thickness, its thermokarst sediments have great preservation potential. However, site-specific geomorphic factors, such as episodic bank-shore erosion or sudden drainage through thermo-erosional valleys or coastal erosion breaching lake basins, can disrupt continuous deposition. A hiatus in the record from 1.8 to 0.9 cal ka BP in Lake Herschel likely resulted from lake drainage or allochthonous slumping due to collapsing shorelines, before continuous sedimentation of unit D recommenced during the last 900 years.

Relevance: 30.00%

Abstract:

Glacial/interglacial changes in the Southern Ocean's air-sea gas exchange have been considered important mechanisms contributing to the glacial/interglacial variability in atmospheric CO2. Hence, understanding past variability in Southern Ocean intermediate- to deep-water chemistry and circulation is fundamental to constraining the role of these processes in modulating glacial/interglacial changes in the global carbon cycle. Our study focused on the glacial/interglacial variability in the vertical extent of southwest Pacific Antarctic Intermediate Water (AAIW). We compared carbon and oxygen isotope records from epibenthic foraminifera of sediment cores bathed in modern AAIW and Upper Circumpolar Deep Water (UCDW; 943-2066 m water depth) to monitor changes in water mass circulation spanning the past 350,000 years. We propose that pronounced freshwater input from melting sea ice into the glacial AAIW significantly hampered the downward expansion of southwest Pacific AAIW, consistent with climate model results for the Last Glacial Maximum. This process led to a pronounced upward displacement of the AAIW-UCDW interface during colder climate conditions and therefore to an expansion of the glacial carbon pool.

Relevance: 30.00%

Abstract:

A small Positron Emission Tomography (PET) demonstrator based on LYSO slabs and Silicon Photomultiplier (SiPM) matrices is under construction at the University and INFN of Pisa. In this paper we present the characterization results of the read-out electronics and of the detection system. Two SiPM matrices, each composed of 8 × 8 SiPM pixels at 1.5 mm pitch, were coupled one-to-one to a LYSO crystal array. Custom front-end ASICs were used to read the 64 channels of each matrix. Data from each front-end were multiplexed and sent to a DAQ board for digital conversion; a motherboard collects the data and communicates with a host computer through a USB port. Specific tests were carried out on the system in order to assess its performance. Furthermore, we measured some of the most important parameters of the system for PET applications.

Relevance: 30.00%

Abstract:

Expert systems are traditionally built from knowledge elicited from the human expert, and it is precisely this knowledge elicitation that is the bottleneck in expert system construction. On the other hand, a data mining system, which extracts knowledge automatically, needs expert guidance on the successive decisions to be made in each of its phases. In this context, expert knowledge and data-mining-discovered knowledge can cooperate, maximizing their individual capabilities: discovered knowledge can be used as a complementary source of knowledge for the expert system, whereas expert knowledge can be used to guide the data mining process. This article summarizes different examples of systems where expert knowledge and data-mining-discovered knowledge cooperate, and reports our experience of such cooperation in a medical diagnosis project we developed, called Intelligent Interpretation of Isokinetics Data. A series of lessons was learned throughout the project; some are generally applicable, while others pertain exclusively to certain project types.
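The cooperation pattern described in this abstract is easy to sketch. The Python fragment below is a minimal illustration, not the Isokinetics system itself: an expert-authored rule is consulted first, and rules mined from data by a shallow decision tree serve as the complementary knowledge source. All feature names, data, and thresholds are illustrative assumptions.

```python
# Minimal sketch of expert/data-mining cooperation (illustrative only).
from sklearn.tree import DecisionTreeClassifier, export_text

def expert_rule(peak_torque, fatigue_index):
    """Hand-authored expert knowledge; abstains by returning None."""
    if peak_torque < 40.0:            # expert-chosen threshold (assumed)
        return "weak"
    return None

# Knowledge discovered by data mining: a shallow tree keeps the mined
# rules readable, so the expert can inspect them and guide the mining.
X = [[35, 0.2], [55, 0.4], [80, 0.1], [60, 0.9]]   # toy training data
y = ["weak", "normal", "normal", "fatigued"]
tree = DecisionTreeClassifier(max_depth=2).fit(X, y)
print(export_text(tree, feature_names=["peak_torque", "fatigue_index"]))

def classify(peak_torque, fatigue_index):
    verdict = expert_rule(peak_torque, fatigue_index)
    # Expert knowledge takes precedence; mined rules complement it.
    if verdict is not None:
        return verdict
    return tree.predict([[peak_torque, fatigue_index]])[0]

print(classify(70, 0.95))             # answered by the mined knowledge
```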

Relevance: 30.00%

Abstract:

Images acquired during free breathing using first-pass gadolinium-enhanced myocardial perfusion magnetic resonance imaging (MRI) exhibit a quasiperiodic motion pattern that needs to be compensated for if a further automatic analysis of the perfusion is to be executed. In this work, we present a method to compensate for this motion by combining independent component analysis (ICA) and image registration. First, we use ICA and a time-frequency analysis to identify the motion and separate it from the intensity change induced by the contrast agent. Then, synthetic reference images are created by recombining all the independent components except the one related to the motion. The resulting image series therefore does not exhibit motion, and its images have intensities similar to those of their original counterparts. Motion compensation is then achieved by using a multi-pass image registration procedure. We tested our method on 39 image series acquired from 13 patients, covering the basal, mid, and apical areas of the left ventricle and consisting of 58 perfusion images each. We validated our method by comparing manually tracked intensity profiles of the myocardial sections to automatically generated ones before and after registration of 13 patient data sets (39 distinct slices). We compared linear, non-linear, and combined ICA-based registration approaches as well as previously published motion compensation schemes. Considering run time and accuracy, a two-step ICA-based motion compensation scheme that first optimizes a translation and then a non-linear transformation performed best, registering the whole series in 32 ± 12 s on a recent workstation. The proposed scheme improves the Pearson correlation coefficient between manually and automatically obtained time-intensity curves from 0.84 ± 0.19 before registration to 0.96 ± 0.06 after registration.
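The decomposition step lends itself to a short sketch. The Python fragment below is a sketch under stated assumptions, not the authors' implementation: FastICA separates the temporal components of a perfusion series, the component with the most spectral power inside an assumed breathing band is taken as the motion component, and motion-free reference frames are rebuilt from the remaining components.

```python
import numpy as np
from sklearn.decomposition import FastICA

def synthetic_references(frames, n_components=5, breathing_band=(0.15, 0.5)):
    """frames: (n_frames, height, width) perfusion series.
    breathing_band is in cycles per frame and is an assumed range."""
    n_frames = frames.shape[0]
    X = frames.reshape(n_frames, -1)          # one flattened image per frame
    ica = FastICA(n_components=n_components, random_state=0)
    S = ica.fit_transform(X)                  # temporal component courses

    # Time-frequency step: pick the component with the most power in the
    # quasiperiodic breathing band as the motion-related component.
    freqs = np.fft.rfftfreq(n_frames, d=1.0)
    band = (freqs >= breathing_band[0]) & (freqs <= breathing_band[1])
    power = np.abs(np.fft.rfft(S, axis=0)) ** 2
    motion = int(np.argmax(power[band].sum(axis=0)))

    # Recombine all independent components except the motion one.
    S_clean = S.copy()
    S_clean[:, motion] = 0.0
    return ica.inverse_transform(S_clean).reshape(frames.shape)
```

The registration itself (the multi-pass, translation-then-non-linear step) would then align each original frame to its synthetic reference.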

Relevance: 30.00%

Abstract:

Background. Over the last years, the number of available informatics resources in medicine has grown exponentially. While specific inventories of such resources have already begun to be developed for Bioinformatics (BI), comparable inventories are not yet available for the Medical Informatics (MI) field, so locating and accessing MI resources currently remains a hard and time-consuming task. Description. We have created a repository of MI resources from the scientific literature, providing free access to its contents through a web-based service. Relevant information describing the resources is automatically extracted from manuscripts published in top-ranked MI journals. We used a pattern matching approach to detect the resources' names and their main features. Detected resources are classified according to three different criteria: functionality, resource type, and domain. To facilitate these tasks, we built three taxonomies by following a novel approach based on folksonomies and social tagging: we adopted the terminology most frequently used by MI researchers in their publications to create the concepts and hierarchical relationships belonging to the taxonomies. The classification algorithm identifies the categories associated with each resource and annotates it accordingly. The database is then populated with these data after manual curation and validation. Conclusions. We have created an online repository of MI resources to assist researchers in locating and accessing the most suitable resources to perform specific tasks. The database contained 282 resources at the time of writing, and we are continuing to expand it by taking into account further publications as well as suggestions from users and resource developers.
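As a rough illustration of the pattern-matching step, the sketch below detects resource mentions of the form "<Name>: a <type> for <task>". The abstract does not give the actual patterns, so the regular expression and the example sentence (including the resource name) are hypothetical.

```python
import re

# Illustrative pattern for phrasings like "<Name>: a <type> for <task>".
PATTERN = re.compile(
    r"(?P<name>[A-Z][A-Za-z0-9_-]+)\s*:\s*an?\s+"
    r"(?P<type>tool|database|system|server|framework|platform)\s+"
    r"(?:for|to)\s+(?P<task>[^.]+)")

def detect_resources(text):
    """Return (name, type, task) triples found in a manuscript sentence."""
    return [(m.group("name"), m.group("type"), m.group("task").strip())
            for m in PATTERN.finditer(text)]

sentence = ("We present MedMiner: a tool for extracting lab findings "
            "from clinical notes.")   # hypothetical resource and sentence
print(detect_resources(sentence))
# [('MedMiner', 'tool', 'extracting lab findings from clinical notes')]
```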

Relevance: 30.00%

Abstract:

An important objective of the INTEGRATE project is to build tools that support the efficient execution of post-genomic, multi-centric clinical trials in breast cancer, including the automatic assessment of the eligibility of patients for available trials. The population suited to be enrolled in a trial is described by a set of free-text eligibility criteria that are both syntactically and semantically complex. At the same time, assessing the eligibility of a patient for a trial requires a machine-processable understanding of the semantics of the eligibility criteria, in order to evaluate whether the patient data available, for example, in the hospital EHR satisfy these criteria. This paper presents an analysis of the semantics of clinical trial eligibility criteria based on relevant medical ontologies in the clinical research domain: SNOMED-CT, LOINC, and MedDRA. We detect subsets of these widely adopted ontologies that characterize the semantics of the eligibility criteria of trials in various clinical domains and compare these sets. Next, we evaluate the occurrence frequency of the concepts in the concrete case of breast cancer (our first application domain) in order to provide meaningful priorities for the task of binding/mapping these ontology concepts to the actual patient data. We further assess the effort required to extend our approach to new domains in terms of the additional semantic mappings that need to be developed.
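The frequency-evaluation step can be sketched as a simple concept count over the criteria text. The fragment below is an illustration, not the INTEGRATE code; the tiny lexicon stands in for real SNOMED-CT, LOINC, and MedDRA term lists, and the codes shown are only examples.

```python
import re
from collections import Counter

LEXICON = {                      # surface form -> (ontology, example code)
    "breast cancer": ("SNOMED-CT", "254837009"),
    "estrogen receptor": ("LOINC", "16112-5"),
    "nausea": ("MedDRA", "10028813"),
}

def concept_frequencies(criteria):
    """Count concept occurrences to prioritize mapping to patient data."""
    counts = Counter()
    for text in criteria:
        for term, concept in LEXICON.items():
            counts[concept] += len(re.findall(re.escape(term), text.lower()))
    return counts.most_common()  # most frequent concepts get mapped first

criteria = ["Histologically confirmed breast cancer",
            "Estrogen receptor positive breast cancer",
            "No persistent nausea"]
print(concept_frequencies(criteria))
```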

Relevance: 30.00%

Abstract:

This paper reports on an innovative approach that aims to reduce information management costs in data-intensive and cognitively complex biomedical environments. Recognizing that high-performance computing paradigms, large-scale data processing technologies, and collaboration support systems can each help remedy data-intensive issues, it adopts a hybrid approach that builds on the synergy of these technologies. The proposed approach provides innovative web-based workbenches that integrate and orchestrate a set of interoperable services, reducing the data-intensiveness and complexity overload at critical decision points to a manageable level and thus permitting stakeholders to be more productive and to concentrate on creative activities.

Relevance: 30.00%

Abstract:

The scientific community uses virtual reality (VR) techniques to understand data and draw conclusions from them in an accessible way. However, these techniques are not frequently used for analyzing large amounts of data in the life sciences, particularly in genomics, due to the high complexity of the data (the curse of dimensionality). Nevertheless, new approaches that bring out the truly important characteristics of the data raise the possibility of constructing VR spaces in which to visually understand their intrinsic nature. The benefits of representing high-dimensional data in three-dimensional spaces by means of dimensionality reduction and transformation techniques, complemented with a strong component of interaction methods, are well known. Thus, a novel framework designed to help visualize and interact with data about diseases is presented. In this paper, the framework is applied to the Van't Veer breast cancer dataset, and oncologists from La Paz Hospital (Madrid) interact with the obtained results. That is to say, we present a first attempt to generate a visually tangible model of breast cancer disease in order to support the expertise of oncologists.
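The core enabling step, projecting high-dimensional profiles to three coordinates that a VR scene can render, can be sketched as follows. PCA stands in here for whatever dimensionality-reduction transformation the framework actually uses, and random data stand in for the Van't Veer expression matrix.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
expression = rng.normal(size=(78, 5000))   # patients x genes (toy stand-in)

# Standardize each gene, then project onto the 3 directions of most
# variance; each row of coords3d becomes one point in the VR space.
coords3d = PCA(n_components=3).fit_transform(
    StandardScaler().fit_transform(expression))
print(coords3d.shape)                      # (78, 3)
```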

Relevance: 30.00%

Abstract:

Background. Gray-scale images make up the bulk of data in biomedical image analysis, and hence the main focus of many image processing tasks lies in the processing of these monochrome images. With ever-improving acquisition devices, spatial and temporal image resolution increases and data sets become very large. Various image processing frameworks exist that make the development of new algorithms easy by using high-level programming languages or visual programming. These frameworks are also accessible to researchers who have little or no background in software development, because they take care of otherwise complex tasks; specifically, working memory is managed automatically, usually at the price of requiring more of it. As a result, processing large data sets with these tools becomes increasingly difficult on workstation-class computers. One alternative to such high-level tools is to develop new algorithms in a language like C++, which gives the developer full control over how memory is handled, but the resulting workflow for prototyping new algorithms is rather time-intensive and, again, not appropriate for a researcher with little or no software development knowledge. Another alternative is to use command line tools that run image processing tasks, use the hard disk to store intermediate results, and provide automation through shell scripts. Although not as convenient as, e.g., visual programming, this approach is still accessible to researchers without a background in computer science. However, only a few tools provide this kind of processing interface; they are usually quite task-specific, and they do not offer a clear path from a prototype shell script to a new command line tool. Results. The proposed framework, MIA, provides a combination of command line tools, plug-ins, and libraries that make it possible to run image processing tasks interactively in a command shell and to prototype using the shell's scripting language. Since the hard disk serves as temporary storage, memory management is usually a non-issue in the prototyping phase. By using string-based descriptions for filters, optimizers, and the like, the transition from shell scripts to full-fledged programs implemented in C++ is also made easy. In addition, its design, based on atomic plug-ins and single-task command line tools, makes MIA easy to extend, usually without the need to touch or recompile existing code. Conclusion. In this article, we describe the general design of MIA, a general-purpose framework for gray-scale image processing. We demonstrate the applicability of the software with example applications from three different research scenarios: motion compensation in myocardial perfusion imaging, the processing of high-resolution image data arising in virtual anthropology, and the retrospective analysis of treatment outcome in orthognathic surgery. With MIA, prototyping algorithms with shell scripts that combine small, single-task command line tools is a viable alternative to the use of high-level languages, an approach that is especially useful when large data sets need to be processed.
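A minimal sketch of the prototyping style MIA advocates, written here with Python's subprocess for concreteness: each single-task tool reads and writes files, so the hard disk holds the intermediate results and working memory is never a concern. The tool names and string-based filter descriptions are hypothetical placeholders, not MIA's actual command set.

```python
import subprocess

# Hypothetical single-task tools chained via files on disk; string-based
# descriptions ("gauss:sigma=2") configure each step, in the spirit of MIA.
steps = [
    ["imgfilter", "-i", "input.png", "-o", "step1.png", "-f", "gauss:sigma=2"],
    ["imgfilter", "-i", "step1.png", "-o", "step2.png", "-f", "binarize:min=80"],
    ["imgstats",  "-i", "step2.png", "-o", "report.txt"],
]

for cmd in steps:
    subprocess.run(cmd, check=True)   # abort the pipeline on the first failure
```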

Relevance: 30.00%

Abstract:

A structure vibrates according to its vibration modes, which are defined by their modal parameters (natural frequencies, damping ratios, and mode shapes). Through measurements of the vibration at key points of the structure, the modal parameters can be estimated. In civil engineering it is difficult to excite a structure in a controlled manner, so techniques that estimate the modal parameters from the recorded response alone are of vital importance for these structures; this technique is known as Operational Modal Analysis (OMA). OMA does not need to excite the structure artificially, relying only on its behavior in service. The motivation for carrying out OMA tests arises in civil engineering because successfully exciting large structures artificially is not only difficult and expensive but may even damage the structure. The importance of OMA lies in the fact that the global behavior of a structure is directly related to its modal parameters, and any variation in stiffness, mass, or support conditions, even a local one, is reflected in the modal parameters. This identification can therefore be integrated into a Structural Health Monitoring system.
The main difficulty in using modal parameters estimated by OMA is the uncertainty associated with the estimation process: there are uncertainties in the values of the modal parameters associated with the computation process (internal) and with the influence of environmental factors (external), such as temperature. This Master's Thesis analyzes these two sources of uncertainty. First, for a laboratory structure, the uncertainties associated with the OMA program used are studied and quantified. Second, for an in-service structure (a stress-ribbon footbridge), both the effect of the OMA program and the influence of environmental factors on the estimation of the modal parameters are studied. More concretely, a method to track the natural frequencies of a given mode is proposed; it includes a multiple linear regression model that removes the influence of these external agents.
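The removal of environmental influence can be sketched directly. The fragment below is a minimal illustration of such a multiple linear regression using synthetic data; the actual regressors used in the thesis are not listed in this abstract, so temperature and humidity here are assumptions.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 200
temperature = rng.uniform(-5, 35, n)           # deg C
humidity = rng.uniform(20, 90, n)              # percent
# Synthetic tracked natural frequency of one mode: baseline value plus a
# temperature-driven drift plus estimation noise.
freq = 3.50 - 0.004 * temperature + rng.normal(0.0, 0.01, n)

X = np.column_stack([temperature, humidity])
model = LinearRegression().fit(X, freq)

# Keep only what the environment cannot explain, re-centred on the mean:
# a remaining drift would hint at structural change rather than weather.
freq_corrected = freq - model.predict(X) + freq.mean()
print(freq_corrected.std() < freq.std())       # True: variance is reduced
```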

Relevance: 30.00%

Abstract:

While a number of virtual data gloves have been used in stroke rehabilitation, there is little evidence about their use in spinal cord injury (SCI). A pilot clinical study with nine SCI subjects compared two groups: one carried out virtual rehabilitation training based on a data glove (CyberTouch) combined with traditional rehabilitation, for 30 minutes a day, twice a week, over two weeks, while the other received only conventional rehabilitation. Furthermore, two functional indexes were developed to assess the patients' performance during the sessions: normalized trajectory length and repeatability. While differences between the groups were not statistically significant, the data-glove group seemed to obtain better results in the muscle balance and functional parameters, and in the dexterity, coordination, and fine grip tests. Regarding the indexes we implemented, every patient showed an improvement in at least one of them, along either the Y-axis or the Z-axis trajectory. This study is a step toward new treatments and objective measures that provide more accurate data about a patient's evolution, allowing clinicians to develop rehabilitation treatments adapted to the abilities and needs of each patient.
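As an illustration, one plausible formulation of the normalized trajectory length index is the travelled path length divided by the straight-line distance between the start and end of a movement, so that 1.0 means a perfectly direct reach. The abstract does not give the exact definition, so this formulation is an assumption.

```python
import numpy as np

def normalized_trajectory_length(points):
    """points: (n, 3) fingertip positions sampled over one movement."""
    points = np.asarray(points, dtype=float)
    path = np.linalg.norm(np.diff(points, axis=0), axis=1).sum()
    direct = np.linalg.norm(points[-1] - points[0])
    return path / direct            # 1.0 = perfectly straight movement

traj = [[0, 0, 0], [1, 0.2, 0], [2, 0.1, 0], [3, 0, 0]]
print(round(normalized_trajectory_length(traj), 3))   # slightly above 1.0
```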

Relevance: 30.00%

Abstract:

Real-world experimentation facilities accelerate the development of Future Internet technologies and services, advance the market for smart infrastructures, and increase the effectiveness of business processes through the Internet. The federation of facilities fosters experimentation and innovation in a larger and more powerful environment, increases the number and variety of the offered services, and opens up possibilities for new experimentation scenarios. This paper introduces a management solution for cloud federation that automates service provisioning to the largest possible extent, relieves developers from time-consuming configuration settings, and provides real-time information covering the whole lifecycle of the provisioned services. This is achieved through solutions for the seamless deployment of services across the federation, for the ability of services to span different infrastructures within the federation, and for the monitoring of resources and data, which can be aggregated with a common structure and offered as an open ecosystem for innovation at developers' disposal. The solution consists of several federation management tools and components that are part of the work on cloud federation conducted within the XIFI project to build a federation of cloud infrastructures for the Future Internet Lab (FIWARE Lab). We present the design and implementation of the FIWARE Lab management tools and components concerned, which are deployed within a federation of 17 cloud infrastructures distributed across Europe.

Relevance: 30.00%

Abstract:

Recent commentaries have proposed the advantages of open exchange of data and informatics resources for improving health-related policies and patient care in Africa. Yet in many African regions both private medical and public health information systems are still unaffordable. Open exchange over the social Web 2.0 could encourage more altruistic support of medical initiatives. We have carried out experiments to demonstrate the feasibility of using this approach to disseminate open data and informatics resources in Africa. After the experiments we developed the AFRICA BUILD Portal, the first social network for African biomedical researchers. Through the AFRICA BUILD Portal users can transparently access several resources. Currently, over 600 researchers are using the distributed and open resources offered through this platform, which is adapted to low-bandwidth connections.