9 results for Medicine--Data processing
in BORIS: Bern Open Repository and Information System - Bern - Switzerland
Abstract:
Methodological evaluation of the proteomic analysis of cardiovascular-tissue material has been performed, with special emphasis on establishing examinations that allow reliable quantitative analysis of silver-stained readouts. Reliability, reproducibility, robustness and linearity were addressed and clarified. In addition, several types of normalization procedures were evaluated and new approaches are proposed. The silver-stained readout was found to offer a convenient approach to quantitation provided a linear range for gel loading is defined. A broad, 10-fold input range (20-200 µg loaded per gel) fulfills the linearity criteria, although at the lowest input (20 µg) a portion of protein species remains undetected. The method is reliable and reproducible within an input range of 65-200 µg. Normalization using the sum of all spot intensities from a silver-stained 2D pattern proved less reliable than other approaches, namely normalization through the median or through the interquartile range. A further refinement, in which the pattern is virtually segmented and a normalization factor is calculated for each stratum, provides highly satisfactory results. The presented results not only provide evidence for the usefulness of silver-stained gels in quantitative evaluation, but are also directly applicable to the monitoring of alterations in cardiovascular pathophysiology.
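The normalization strategies compared above are straightforward to prototype. The sketch below, in Python with NumPy, contrasts per-gel normalization factors derived from the total spot intensity, the median, and the interquartile range; the function and parameter names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def normalize_spots(intensities, method="median"):
    """Normalize a (spots x gels) intensity matrix so gels are comparable.

    A minimal sketch of the normalization strategies compared in the
    abstract; names and defaults are illustrative, not the authors' code.
    """
    intensities = np.asarray(intensities, dtype=float)
    if method == "total":          # sum of all spot intensities per gel
        factors = intensities.sum(axis=0)
    elif method == "median":       # per-gel median spot intensity
        factors = np.median(intensities, axis=0)
    elif method == "iqr":          # per-gel interquartile range
        q1, q3 = np.percentile(intensities, [25, 75], axis=0)
        factors = q3 - q1
    else:
        raise ValueError(f"unknown method: {method}")
    # Rescale so the mean factor is 1, preserving the overall intensity scale.
    factors = factors / factors.mean()
    return intensities / factors
```

The per-stratum refinement described above would apply the same idea after splitting each 2D pattern into segments and computing one factor per segment.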
Abstract:
The problem of re-sampling spatially distributed data organized into regular or irregular grids to finer or coarser resolution is a common task in data processing. This procedure is known as 'gridding' or 're-binning'. Depending on the quantity the data represent, the gridding algorithm has to meet different requirements. For example, histogrammed physical quantities such as mass or energy have to be re-binned so as to conserve the overall integral. Moreover, if the quantity is positive definite, negative sampling values should be avoided. The gridding process requires a re-distribution of the original data set onto a user-requested grid according to a distribution function. The distribution function can be determined from the given data by interpolation methods. In general, accurate interpolation of heavily fluctuating data subject to multiple boundary conditions requires polynomial interpolation functions of second or even higher order. However, this may result in unrealistic deviations (overshoots or undershoots) of the interpolation function from the data. Accordingly, the re-sampled data may overestimate or underestimate the given data by a significant amount. The gridding algorithm presented in this work was developed to overcome these problems. Instead of a straightforward interpolation of the given data using high-order polynomials, a parametrized Hermitian interpolation curve is used to approximate the integrated data set. A single parameter is provided through which the user can control the behavior of the interpolation function, i.e. the amount of overshoot and undershoot. Furthermore, it is shown how the algorithm can be extended to multidimensional grids. The algorithm was compared with commonly used gridding algorithms based on linear and cubic interpolation functions. It is shown that such interpolation functions may overestimate or underestimate the source data by about 10-20%, while the new algorithm can be tuned to reduce these interpolation errors significantly. The accuracy of the new algorithm was tested on a series of X-ray CT images (head and neck, lung, pelvis). The new algorithm significantly improves the accuracy of the re-sampled images in terms of the mean square error and the quality index introduced by Wang and Bovik (2002 IEEE Signal Process. Lett. 9 81-4).
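The core idea, interpolating the integrated (cumulative) data with a shape-preserving Hermitian curve and differencing it at the new bin edges, can be sketched in a few lines of Python. The version below uses SciPy's PCHIP interpolant as the monotone Hermite curve; the paper's single user-tunable overshoot parameter is not reproduced, so this is an illustration of the principle rather than the published algorithm.

```python
import numpy as np
from scipy.interpolate import PchipInterpolator

def rebin_conservative(edges_src, counts, edges_dst):
    """Re-bin histogrammed data onto a new grid while conserving the integral.

    Sketch of the approach described in the abstract: interpolate the
    *cumulative* integral with a monotone Hermitian curve and difference it
    at the new bin edges. PCHIP stands in for the paper's parametrized
    Hermitian interpolation; the tuning parameter is omitted here.
    """
    edges_src = np.asarray(edges_src, dtype=float)   # N + 1 source bin edges
    counts = np.asarray(counts, dtype=float)         # N source bin contents
    # Cumulative integral at the source bin edges (starts at 0).
    cum = np.concatenate(([0.0], np.cumsum(counts)))
    # Monotone cubic Hermite interpolation of the cumulative curve;
    # monotonicity of `cum` guarantees non-negative re-binned values.
    spline = PchipInterpolator(edges_src, cum, extrapolate=False)
    cum_dst = spline(np.clip(edges_dst, edges_src[0], edges_src[-1]))
    return np.diff(cum_dst)
```

Because the new bin contents are differences of a single interpolated cumulative curve, their sum over the full range equals the original total by construction.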
Abstract:
INTRODUCTION: A multi-centre study was conducted during 2005 by means of a questionnaire posted on the Italian Society of Emergency Medicine (SIMEU) web page. Our intention was to carry out an organisational and functional analysis of Italian Emergency Departments (ED) in order to identify some macro-indicators of the activities performed. Participation was good, in that 69 ED (3,285,440 admissions to emergency services) responded to the questionnaire. METHODS: The study was based on 18 questions: 3 regarding ED personnel, 2 regarding organisational and functional aspects, 5 on ED activity, 7 on triage and 1 on the assessment of the quality perceived by ED users. RESULTS AND CONCLUSION: The replies revealed that 91.30% of the ED were equipped with data-processing software, which, in 96.83% of cases, tracked the entire itinerary of the patient. On average, about 48,000 patients/year used each ED: 76.72% were discharged and 18.31% were hospitalised. Observation Units were active in 81.16% of the ED examined. Triage programmes were in place in 92.75% of ED: in 75.81% of these, triage was performed throughout the entire itinerary of the patient; in 16.13% it was performed only on a symptom basis, and in 8.06% only on call. Of the patients arriving at the ED, 24.19% were assigned a non-urgent triage code, 60.01% an urgent code, 14.30% an emergent code and 1.49% a life-threatening code. Waiting times were 52.39 min for non-urgent patients, 40.26 min for urgent, 12.08 min for emergent and 1.19 min for life-threatening patients.
Abstract:
BACKGROUND: To investigate whether non-rigid image registration reduces motion artifacts in triggered and non-triggered diffusion tensor imaging (DTI) of native kidneys. A secondary aim was to determine whether the improvements achieved through registration allow respiratory triggering to be omitted. METHODS: Twenty volunteers underwent coronal DTI of the kidneys with nine b-values (10-700 s/mm²) at 3 Tesla. Image registration was performed using a multimodal non-rigid registration algorithm. Data processing yielded the apparent diffusion coefficient (ADC), the contribution of perfusion (FP), and the fractional anisotropy (FA). To compare data stability, the root mean square error (RMSE) of the fitting and the standard deviations within the regions of interest (SDROI) were evaluated. RESULTS: RMSEs decreased significantly after registration for both triggered and non-triggered scans (P < 0.05). SDROI for ADC, FA, and FP were significantly lower after registration in both medulla and cortex of triggered scans (P < 0.01). Similarly, the SDROI of FA and FP decreased significantly in non-triggered scans after registration (P < 0.05). RMSEs were significantly lower in triggered than in non-triggered scans, both with and without registration (P < 0.05). CONCLUSION: Respiratory motion correction by registration of individual echo-planar images leads to clearly reduced signal variations in renal DTI for both triggered and, particularly, non-triggered scans. Secondarily, the results suggest that respiratory triggering remains advantageous.
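For readers unfamiliar with how ADC and FP are extracted from such multi-b-value data, the sketch below shows a common simplified segmented IVIM-style fit for a single voxel or ROI, together with an RMSE of the fit as a stability measure. The b-value threshold and the fitting strategy are illustrative assumptions, not the authors' exact processing pipeline.

```python
import numpy as np

def segmented_diffusion_fit(b, signal, b_thresh=200.0):
    """Simplified segmented fit of multi-b-value diffusion data (one ROI).

    Illustrative only: estimates ADC from the high-b regime and the
    perfusion contribution FP from the excess low-b signal. The threshold
    `b_thresh` is a common convention, not taken from the paper.
    """
    b = np.asarray(b, dtype=float)
    s = np.asarray(signal, dtype=float)
    # 1) Fit the log-signal at high b, where perfusion effects are small:
    #    ln S(b) ~ ln S_tissue - b * ADC
    hi = b >= b_thresh
    slope, intercept = np.polyfit(b[hi], np.log(s[hi]), 1)
    adc = -slope
    # 2) Extrapolate the tissue component to b = 0; the excess measured
    #    signal at the lowest b is attributed to perfusion.
    s0_tissue = np.exp(intercept)
    fp = 1.0 - s0_tissue / s[np.argmin(b)]
    # RMSE of the mono-exponential fit over the high-b points.
    rmse = np.sqrt(np.mean((s[hi] - s0_tissue * np.exp(-b[hi] * adc)) ** 2))
    return adc, fp, rmse
```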
Abstract:
The article proposes granular computing as a theoretical, formal and methodological basis for the newly emerging research field of human–data interaction (HDI). We argue that the ability to represent and reason with information granules is a prerequisite for data legibility. As such, it allows the research agenda of HDI to be extended to encompass the topic of collective intelligence amplification, which is seen as an opportunity offered by today's increasingly pervasive computing environments. As an example of collective intelligence amplification in HDI, we introduce a collaborative urban-planning use case in a cognitive city environment and show how an iterative process of user input and human-oriented automated data processing can support collective decision making. As a basis for automated human-oriented data processing, we use the spatial granular calculus of granular geometry.
Abstract:
Navigation of deep-space probes most commonly relies on the spacecraft Doppler tracking technique. Orbital parameters are determined from a series of repeated measurements of the frequency shift of a microwave carrier over a given integration time. Currently, both ESA and NASA operate antennas at several sites around the world to ensure the tracking of deep-space probes. Only a small number of software packages are currently used to process Doppler observations. The Astronomical Institute of the University of Bern (AIUB) has recently started developing Doppler data processing capabilities within the Bernese GNSS Software. This software has been used extensively for precise orbit determination of Earth-orbiting satellites using GPS data collected by on-board receivers, and for the subsequent determination of the Earth's gravity field. In this paper, we present the current status of the Doppler data modeling and orbit determination capabilities of the Bernese GNSS Software using GRAIL data. In particular, we focus on the implemented orbit determination procedure used for the combined analysis of Doppler and intersatellite Ka-band data. We show that even at this early stage of development we can achieve an accuracy of a few mHz on two-way S-band Doppler observations and of 2 µm/s on Ka-band range-rate (KBRR) data from the GRAIL primary mission phase.
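The quoted mHz-level Doppler accuracy can be related to a line-of-sight velocity through the classic two-way Doppler relation, sketched below in Python. The uplink frequency and the S-band transponder turnaround ratio (240/221) are conventional illustrative values, not parameters taken from the paper.

```python
C = 299_792_458.0  # speed of light [m/s]

def two_way_doppler_shift(range_rate, f_uplink=2.1e9, turnaround=240.0 / 221.0):
    """Instantaneous two-way Doppler shift for a coherent S-band link [Hz].

    Minimal sketch of the relation behind spacecraft Doppler tracking: the
    received carrier is shifted by roughly -2 * rdot / c times the downlink
    frequency. Default uplink frequency and the S-band turnaround ratio
    (240/221) are conventional illustrative values, not from the paper.
    """
    f_downlink = turnaround * f_uplink           # coherent transponder downlink
    return -2.0 * range_rate / C * f_downlink    # positive rdot -> lower freq.

# A few mHz of Doppler accuracy corresponds to tens of um/s along the line
# of sight: dv = c * df / (2 * f_downlink).
dv_per_mHz = C * 1e-3 / (2.0 * (240.0 / 221.0) * 2.1e9)  # ~6.6e-5 m/s
```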