9 results for Fluid mechanics - Data processing
in BORIS: Bern Open Repository and Information System - Bern - Switzerland
Abstract:
Methodological evaluation of the proteomic analysis of cardiovascular-tissue material has been performed, with special emphasis on establishing procedures that allow reliable quantitative analysis of silver-stained readouts. Reliability, reproducibility, robustness and linearity were addressed and clarified. The silver-stained readout was found to offer a convenient approach for quantitation provided that a linear range for gel loading is defined. A broad, 10-fold input range (loading 20-200 µg per gel) fulfills the linearity criteria, although at the lowest input (20 µg) a portion of protein species remains undetected. The method is reliable and reproducible within an input range of 65-200 µg. Normalization using the sum of all spot intensities from a silver-stained 2D pattern proved less reliable than the alternative approaches, namely normalization by the median or by the interquartile range. A refinement of the normalization, in which the pattern is virtually segmented and a normalization factor is calculated for each stratum, gives highly satisfactory results. The presented results not only provide evidence for the usefulness of silver-stained gels in quantitative evaluation, but are also directly applicable to monitoring alterations in cardiovascular pathophysiology.
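The normalization strategies compared in the abstract (total spot intensity, median, interquartile range, and per-stratum factors) can be illustrated with a minimal sketch. The helper functions, the segmentation scheme, and the spot-intensity data below are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def normalization_factor(intensities, method="median"):
    """Per-gel scaling factor for 2D spot intensities.

    'sum', 'median' and 'iqr' mirror the approaches compared in the
    abstract; the exact formulas are assumptions for illustration.
    """
    x = np.asarray(intensities, dtype=float)
    if method == "sum":
        return x.sum()
    if method == "median":
        return np.median(x)
    if method == "iqr":
        q1, q3 = np.percentile(x, [25, 75])
        return q3 - q1
    raise ValueError(f"unknown method: {method}")

def normalize_by_strata(intensities, n_strata=4, method="median"):
    """Segment the sorted spot list into strata and normalize each stratum
    by its own factor (a sketch of the 'virtual segmentation' idea)."""
    x = np.sort(np.asarray(intensities, dtype=float))
    strata = np.array_split(x, n_strata)
    return np.concatenate([s / normalization_factor(s, method) for s in strata])

# Hypothetical spot intensities from one silver-stained gel
spots = np.random.default_rng(0).lognormal(mean=2.0, sigma=1.0, size=500)
print(spots / normalization_factor(spots, "median"))
print(normalize_by_strata(spots, n_strata=4))
```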
Abstract:
This paper reviews the methods, benefits and challenges associated with the adoption and translation of computational fluid dynamics (CFD) modelling within cardiovascular medicine. CFD, a specialist area of mathematics and a branch of fluid mechanics, is used routinely in a diverse range of safety-critical engineering systems and is increasingly being applied to the cardiovascular system. By facilitating rapid, economical, low-risk prototyping, CFD modelling has already revolutionised research and development of devices such as stents, valve prostheses, and ventricular assist devices. Combined with cardiovascular imaging, CFD simulation enables detailed characterisation of complex physiological pressure and flow fields and the computation of metrics that cannot be directly measured, for example wall shear stress. CFD models are now being translated into clinical tools for physicians to use across the spectrum of coronary, valvular, congenital, myocardial and peripheral vascular diseases. CFD modelling is apposite for minimally-invasive patient assessment. Patient-specific modelling (incorporating data unique to the individual) and multi-scale modelling (combining models of different length- and time-scales) enable individualised risk prediction and virtual treatment planning. This represents a significant departure from the traditional dependence upon registry-based, population-averaged data. Model integration is progressively moving towards 'digital patient' or 'virtual physiological human' representations. When combined with population-scale numerical models, these models have the potential to reduce the cost, time and risk associated with clinical trials. The adoption of CFD modelling signals a new era in cardiovascular medicine. While the potential benefits are substantial, a number of academic and commercial groups are addressing the associated methodological, regulatory, education- and service-related challenges.
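As an illustration of a metric that CFD provides but that cannot be measured directly, wall shear stress can be estimated from the near-wall velocity gradient, tau_w = mu * |du/dr| at the wall. The sketch below uses an assumed Poiseuille-type velocity profile and illustrative blood and vessel parameters, not data or methods from the paper:

```python
import numpy as np

# Assumed blood properties and vessel geometry (illustrative values only)
mu = 3.5e-3        # dynamic viscosity of blood [Pa s]
radius = 2.0e-3    # vessel radius [m]
u_max = 0.4        # centreline velocity [m/s]

# Poiseuille profile u(r) = u_max * (1 - (r/R)^2). Its wall gradient is
# analytic, but here it is estimated numerically, as one would do from a
# sampled CFD velocity field.
r = np.linspace(0.0, radius, 201)
u = u_max * (1.0 - (r / radius) ** 2)

du_dr_wall = np.gradient(u, r)[-1]   # velocity gradient at the wall
tau_wall = mu * abs(du_dr_wall)      # wall shear stress [Pa]

print(f"Estimated wall shear stress: {tau_wall:.2f} Pa")
# Analytic check for Poiseuille flow: tau_w = 2 * mu * u_max / R = 1.4 Pa
```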
Abstract:
Several techniques have been proposed to exploit GNSS-derived kinematic orbit information for the determination of long-wavelength gravity field features. These methods include the (i) celestial mechanics approach, (ii) short-arc approach, (iii) point-wise acceleration approach, (iv) averaged acceleration approach, and (v) energy balance approach. Although there is a general consensus that, except for energy balance, these methods theoretically provide equivalent results, real-data gravity field solutions from kinematic orbit analysis have never been evaluated against each other within a consistent data processing environment. This contribution strives to close this gap. Target consistency criteria for our study are the input data sets, period of investigation, spherical harmonic resolution, a priori gravity field information, etc. We compare GOCE gravity field estimates based on the aforementioned approaches as computed at the Graz University of Technology, the University of Bern, the University of Stuttgart/Austrian Academy of Sciences, and by RHEA Systems for the European Space Agency. The involved research groups complied with most of the consistency criteria; deviations occur only where full compliance was technically unfeasible. Performance measures include formal errors, differences with respect to a state-of-the-art GRACE gravity field, (cumulative) geoid height differences, and SLR residuals from precise orbit determination of geodetic satellites. We found that for approaches (i) to (iv) the cumulative geoid height differences at spherical harmonic degree 100 differ by only ≈10 %; in the absence of the polar data gap, the SLR residuals agree at the ≈96 % level. From our investigations, we conclude that the real-data analysis results are in agreement with the theoretical considerations concerning the (relative) performance of the different approaches.
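The cumulative geoid height difference used as a performance measure follows the standard degree-variance formula, N(l_max) = R * sqrt(sum over degrees l = 2..l_max and orders m of (dC_lm^2 + dS_lm^2)), applied to the differences of fully normalized spherical harmonic coefficients between two solutions. The sketch below applies that formula to hypothetical coefficient differences; the actual comparison setup in the paper may differ in detail:

```python
import numpy as np

R_EARTH = 6378136.3  # reference radius [m]

def cumulative_geoid_difference(dC, dS, l_max):
    """Cumulative geoid height difference up to degree l_max from fully
    normalized coefficient differences dC[l, m] and dS[l, m]."""
    total = 0.0
    for l in range(2, l_max + 1):
        total += np.sum(dC[l, : l + 1] ** 2 + dS[l, : l + 1] ** 2)
    return R_EARTH * np.sqrt(total)

# Hypothetical coefficient differences between two GOCE solutions
rng = np.random.default_rng(1)
L = 100
dC = rng.normal(scale=1e-11, size=(L + 1, L + 1))
dS = rng.normal(scale=1e-11, size=(L + 1, L + 1))
print(f"Cumulative geoid difference at degree {L}: "
      f"{cumulative_geoid_difference(dC, dS, L):.4f} m")
```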
Abstract:
The article proposes granular computing as a theoretical, formal and methodological basis for the newly emerging research field of human–data interaction (HDI). We argue that the ability to represent and reason with information granules is a prerequisite for data legibility. As such, it allows the research agenda of HDI to be extended to encompass the topic of collective intelligence amplification, which is seen as an opportunity afforded by today's increasingly pervasive computing environments. As an example of collective intelligence amplification in HDI, we introduce a collaborative urban planning use case in a cognitive city environment and show how an iterative process of user input and human-oriented automated data processing can support collective decision making. As a basis for automated human-oriented data processing, we use the spatial granular calculus of granular geometry.
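To make the notion of an information granule concrete, the toy sketch below groups raw spatial data points into coarse grid-cell granules whose size sets the level of detail at which people inspect the data. It is only an illustration of granulation in general, not the granular-geometry calculus referenced in the abstract; the data and cell size are assumed:

```python
from collections import defaultdict

def granulate(points, cell_size):
    """Group raw (x, y) points into coarse spatial granules (grid cells)."""
    granules = defaultdict(list)
    for x, y in points:
        cell = (int(x // cell_size), int(y // cell_size))
        granules[cell].append((x, y))
    return granules

# Hypothetical citizen-reported locations in an urban planning scenario
reports = [(1.2, 3.4), (1.6, 3.1), (7.8, 2.2), (8.1, 2.9), (7.9, 2.4)]
for cell, members in granulate(reports, cell_size=2.0).items():
    print(f"granule {cell}: {len(members)} reports")
```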
Abstract:
At sub-arc depths, serpentinites release volatiles and several of the fluid-mobile trace elements found in arc magmas. Constraining element uptake in these rocks and defining the trace element composition of fluids released upon serpentinite dehydration can improve our understanding of mass transfer across subduction zones and to volcanic arcs. The eclogite-facies garnet metaperidotite and chlorite harzburgite bodies embedded in the paragneiss of the subduction mélange at Cima di Gagnone derive from serpentinized peridotite protoliths and are unique examples of ultramafic rocks that experienced subduction metasomatism and devolatilization. In these rocks, metamorphic olivine and garnet trap polyphase inclusions representing the fluid released during high-pressure breakdown of antigorite and chlorite. Combining major element mapping and laser-ablation ICP-MS bulk inclusion analysis, we characterize the mineral content of the polyphase inclusions and quantify the fluid composition. Silicates, Cl-bearing phases, sulphides, carbonates, and oxides document post-entrapment mineral growth in the inclusions starting immediately after fluid entrapment. Compositional data reveal the presence of two different fluid types. The first (type A) records a fluid prominently enriched in fluid-mobile elements, with Cl, Cs, Pb, As and Sb concentrations up to 10³ times primitive mantle (PM), Tl and Ba at ~10² PM, Rb, B, Sr, Li and U of the order of 10¹ PM, and alkalis at ~2 PM. The second fluid (type B) shows considerably lower fluid-mobile element enrichments, but its enrichment pattern is comparable to that of the type A fluid. Our data reveal multistage fluid uptake in these peridotite bodies, including selective element enrichment during seafloor alteration, followed by fluid-rock interaction during subduction metamorphism in the plate-interface mélange. Here, infiltration of sediment-equilibrated fluid produced significant enrichment of the serpentinites in As, Sb, B and Pb, an enriched trace element pattern that was then transferred to the fluid released at greater depth upon serpentine dehydration (the type A fluid). The type B fluid hosted by garnet may record the composition of the chlorite-breakdown fluid released at even greater depth. The Gagnone case study demonstrates that serpentinized peridotites acquire water and fluid-mobile elements during ocean-floor hydration and through exchange with sediment-equilibrated fluids in the early subduction stages. Subsequent antigorite devolatilization at sub-arc depths delivers aqueous fluids to the mantle wedge that can be prominently enriched in sediment-derived components, potentially triggering arc magmatism without the need for concomitant dehydration/melting of metasediments or altered oceanic crust.
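The enrichment levels quoted above follow the usual primitive-mantle (PM) normalization convention: a measured concentration divided by the PM reference concentration for that element. The sketch below shows that arithmetic; the PM reference values are rough, commonly cited numbers and the fluid concentrations are hypothetical, not the authors' data:

```python
# Approximate primitive-mantle reference concentrations in ppm
# (rough values, assumed here for illustration only)
PRIMITIVE_MANTLE_PPM = {"Cs": 0.021, "Pb": 0.15, "B": 0.30, "Ba": 6.6, "Sr": 19.9}

def pm_normalize(measured_ppm):
    """Return measured concentration / primitive-mantle concentration."""
    return {el: c / PRIMITIVE_MANTLE_PPM[el] for el, c in measured_ppm.items()}

# Hypothetical fluid-inclusion concentrations (ppm), for illustration only
fluid = {"Cs": 15.0, "Pb": 120.0, "B": 4.5, "Ba": 600.0, "Sr": 180.0}
for element, factor in pm_normalize(fluid).items():
    print(f"{element}: {factor:.0f} x PM")
```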
Abstract:
Navigation of deep-space probes most commonly relies on the spacecraft Doppler tracking technique. Orbital parameters are determined from a series of repeated measurements of the frequency shift of a microwave carrier over a given integration time. Currently, both ESA and NASA operate antennas at several sites around the world to ensure the tracking of deep-space probes, and only a small number of software packages are used to process the Doppler observations. The Astronomical Institute of the University of Bern (AIUB) has recently started the development of Doppler data processing capabilities within the Bernese GNSS Software. This software has been used extensively for Precise Orbit Determination of Earth-orbiting satellites from GPS data collected by on-board receivers, and for subsequent determination of the Earth gravity field. In this paper, we present the current status of the Doppler data modeling and orbit determination capabilities in the Bernese GNSS Software using GRAIL data. In particular, we focus on the implemented orbit determination procedure used for the combined analysis of Doppler and intersatellite Ka-band data. We show that even at this early stage of development we achieve an accuracy of a few mHz on the two-way S-band Doppler observations and of 2 µm/s on the Ka-band range-rate (KBRR) data from the GRAIL primary mission phase.
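For context, a two-way Doppler residual translates into an equivalent line-of-sight range-rate error via dv = c * df / (2 * f_carrier). The sketch below evaluates that relation for a few-mHz residual with an assumed S-band carrier frequency; the numbers are illustrative conversions, not results from the paper:

```python
C = 299_792_458.0   # speed of light [m/s]
F_S_BAND = 2.3e9    # assumed S-band carrier frequency [Hz]

def doppler_to_range_rate(delta_f_hz, f_carrier_hz=F_S_BAND):
    """Convert a two-way Doppler frequency residual to the equivalent
    line-of-sight range-rate error: dv = c * df / (2 * f)."""
    return C * delta_f_hz / (2.0 * f_carrier_hz)

# A few-mHz Doppler accuracy, as quoted in the abstract
for df in (1e-3, 3e-3):
    print(f"{df * 1e3:.0f} mHz  ->  {doppler_to_range_rate(df) * 1e3:.3f} mm/s")
```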