933 results for Software Package Data Exchange (SPDX)
Abstract:
BACKGROUND Lung clearance index (LCI), a marker of ventilation inhomogeneity, is elevated early in children with cystic fibrosis (CF). However, in infants with CF, LCI values are found to be normal, although structural lung abnormalities are often detectable. We hypothesized that this discrepancy is due to inadequate algorithms in the available software package. AIM Our aim was to challenge the validity of these software algorithms. METHODS We compared multiple breath washout (MBW) results of the current software algorithms (automatic modus) to refined algorithms (manual modus) in 17 asymptomatic infants with CF and 24 matched healthy term-born infants. The main difference between these two analysis methods lies in the calculation of the molar mass differences that the system uses to define the completion of the measurement. RESULTS In infants with CF, the refined manual modus revealed clearly elevated LCI above 9 in 8 out of 35 measurements (23%), all of which showed LCI values below 8.3 using the automatic modus (paired t-test comparing the means, P < 0.001). Healthy infants showed normal LCI values with both analysis methods (n = 47, paired t-test, P = 0.79). The most relevant cause of falsely normal LCI values in infants with CF using the automatic modus was premature recognition of the end-of-test during the washout. CONCLUSION We recommend the manual modus for the analysis of MBW outcomes in infants in order to obtain more accurate results. This will allow appropriate use of infant lung function results for clinical and scientific purposes.
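The end-of-test logic at issue can be illustrated with a toy calculation. This is a hypothetical sketch, not the commercial package's algorithm: LCI is taken as cumulative expired volume over FRC at the first breath whose end-tidal tracer concentration falls below 1/40 of the starting concentration, so detecting that threshold too early truncates the washout and lowers the LCI.

```python
# Hypothetical sketch of LCI computation from multiple-breath washout
# data. Breath records are (end_tidal_concentration, expired_volume).

def lci(breaths, frc, start_conc, threshold_fraction=1 / 40):
    """Return LCI = cumulative expired volume / FRC at washout completion.

    The washout is taken as complete at the first breath whose end-tidal
    tracer concentration falls below threshold_fraction of the starting
    concentration (the conventional 1/40 criterion).
    """
    cumulative_volume = 0.0
    for conc, volume in breaths:
        cumulative_volume += volume
        if conc < threshold_fraction * start_conc:
            return cumulative_volume / frc
    raise ValueError("washout never reached the completion threshold")

# Toy washout: tracer concentration decays by 20% per breath from 4%,
# with a constant 0.05 L expired volume per breath (invented numbers).
breaths = [(0.04 * 0.8 ** n, 0.05) for n in range(1, 30)]
print(round(lci(breaths, frc=0.20, start_conc=0.04), 2))
```

Ending the loop at a higher concentration threshold (as a premature end-of-test detection effectively does) returns a smaller cumulative volume and hence a falsely low LCI.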
Abstract:
High Angular Resolution Diffusion Imaging (HARDI) techniques, including Diffusion Spectrum Imaging (DSI), have been proposed to resolve crossing and other complex fiber architecture in human brain white matter. In these methods, directional information of diffusion is inferred from the peaks in the orientation distribution function (ODF). Extensive histological studies on macaque brain, cat cerebellum, rat hippocampus and optic tracts, and bovine tongue are qualitatively in agreement with DSI-derived ODFs and tractography. However, only two studies in the literature have validated DSI results using physical phantoms, and neither was performed on a clinical MRI scanner. Moreover, the few studies that optimized DSI in a clinical setting did not include a comparison against physical phantoms. Finally, there is a lack of consensus on the necessary pre- and post-processing steps in DSI, and ground truth diffusion fiber phantoms are not yet standardized. Therefore, the aims of this dissertation were to design and construct novel diffusion phantoms, employ post-processing techniques in order to systematically validate and optimize DSI-derived fiber ODFs in crossing regions on a clinical 3T MR scanner, and develop user-friendly software for DSI data reconstruction and analysis. Phantoms with fixed crossing-fiber configurations (two fibers crossing at 90° and at 45°, and three fibers crossing at 60°) were constructed using novel hollow plastic capillaries and novel placeholders. T2-weighted MRI results on these phantoms demonstrated high SNR, homogeneous signal, and absence of air bubbles. In addition, a technique to deconvolve the response function of an individual peak from the overall ODF was implemented, alongside other DSI post-processing steps. This technique greatly improved the angular resolution of otherwise unresolvable peaks in a crossing-fiber ODF.
The effects of DSI acquisition parameters and SNR on the resultant angular accuracy of DSI on the clinical scanner were studied and quantified using the developed phantoms. With high angular direction sampling and reasonable levels of SNR, quantification of the crossing regions in the 90°, 45° and 60° phantoms resulted in successful detection of angular information with mean ± SD of 86.93° ± 2.65°, 44.61° ± 1.6° and 60.03° ± 2.21°, respectively, while simultaneously enhancing the ODFs in regions containing single fibers. To demonstrate the applicability of these validated methodologies, improvements in ODFs and fiber tracking in known crossing-fiber regions of normal human subjects were shown, and an in-house MATLAB software package with an easy-to-use graphical user interface was developed to streamline DSI data reconstruction and post-processing. In conclusion, the phantoms developed in this dissertation offer a means of providing ground truth for validating the reconstruction and tractography algorithms of various diffusion models (including DSI). Furthermore, the deconvolution methodology, when applied as an additional DSI post-processing step, significantly improved the angular accuracy of the ODFs obtained from DSI and should be applicable to ODFs obtained from other high angular resolution diffusion imaging techniques.
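As an illustration of the angular-accuracy metric quoted above (the vectors below are hypothetical, not the dissertation's data), the crossing angle can be computed between two detected ODF peak directions, folding antipodal directions together since fiber orientations are sign-free:

```python
import math

def angle_between(u, v):
    """Angle in degrees between two 3-D direction vectors, treating
    antipodal vectors as the same fiber orientation."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    cos = abs(dot) / norm              # fold antipodal directions together
    return math.degrees(math.acos(min(1.0, cos)))

# Two detected peaks in a nominal 90° crossing phantom (invented values):
peak1 = (1.0, 0.05, 0.0)   # slightly mis-estimated x-axis fiber
peak2 = (0.0, 1.0, 0.0)    # y-axis fiber
print(round(angle_between(peak1, peak2), 2))
```

Repeating this over many crossing-region voxels yields the mean ± SD figures reported for each phantom.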
Abstract:
The core descriptions (chapter 7) summarize the most important results of the analysis of each sediment core following procedures applied during ODP/IODP expeditions. All cores were opened, described, and color-scanned. In the core descriptions the first column displays the lithological data, which are based on visual analysis of the core and supplemented by information from binocular and smear slide analyses. The sediment classification largely follows ODP/IODP convention. Lithological names consist of a principal name based on composition, degree of lithification, and/or texture as determined from visual description and microscopic observations. The structure column shows the intensity of bioturbation together with individual or special features (turbidites, volcanic ash layers, plant debris, shell fragments, etc.). The hue and chroma attributes of color were determined by comparison with the Munsell soil color charts and are given in the color column in Munsell notation. A GretagMacbeth™ Spectrolino spectrophotometer was used to measure percent reflectance values of sediment color at 36 wavelength channels over the visible light range (380-730 nm) on all of the cores. The digital reflectance data were routinely obtained from the surface of the split cores (archive half), measured in 1 cm steps. The Spectrolino is equipped with a measuring aperture with a folding mechanism that allows exact positioning on the split core, and is connected to a portable computer. The data are displayed directly in the Excel software package and can be checked immediately. From all the color measurements, the red/blue ratio (700 nm/450 nm) and the lightness are shown for each core together with the visual core description. The reflectance of individual wavelengths is often significantly affected by the presence of minor amounts of oxyhydroxides or sulphides; to eliminate these effects, we used the red/blue ratio and lightness.
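The two derived quantities shown with the core descriptions are straightforward to compute from a 36-channel reading. This sketch (illustrative, not the authors' code) maps wavelengths to channel indices on the 380-730 nm, 10 nm grid and uses mean reflectance as a simple lightness proxy:

```python
def channel_index(wavelength_nm, start=380, step=10):
    """Map a wavelength to its channel index on a 380-730 nm, 10 nm grid."""
    return (wavelength_nm - start) // step

def red_blue_ratio(reflectance):
    """Red/blue ratio: 700 nm reflectance over 450 nm reflectance."""
    return reflectance[channel_index(700)] / reflectance[channel_index(450)]

def lightness(reflectance):
    """Mean percent reflectance over the visible range (a simple proxy)."""
    return sum(reflectance) / len(reflectance)

# Toy spectrum: reflectance rising linearly from 10% at 380 nm to 45% at 730 nm.
spectrum = [10 + i for i in range(36)]
print(round(red_blue_ratio(spectrum), 3))  # → 2.471
print(lightness(spectrum))                 # → 27.5
```

A reddish, oxidized sediment would show a ratio well above 1, while the lightness tracks overall brightness independently of hue.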
Abstract:
Sr isotope analyses have been conducted on anhydrite samples from the TAG (Trans-Atlantic Geotraverse) active hydrothermal mound (26°08′N, Mid-Atlantic Ridge) that have previously been shown to exhibit two distinct patterns of REE behavior when normalized to TAG end-member hydrothermal fluid. Despite differences in REE patterns, the Sr isotope data indicate that all the anhydrites precipitated from fluids with a similar range of hydrothermal fluid and seawater components, and all but one were seawater-dominated (52%-75%). Speciation calculations using the EQ3/6 software package for geochemical modeling of aqueous systems suggest that the REE complexation behavior in different fluid mixing scenarios can explain the variations in the REE patterns. Anhydrites that exhibit relatively flat REE patterns [(La_bs)/(Yb_bs) = 0.8-2.0; subscript bs indicates normalization to end-member black smoker hydrothermal fluid] and a small or no Eu anomaly [(Eu_bs)/(Eu*_bs) = 0.8-2.0] are inferred to have precipitated from mixes of end-member hydrothermal fluid and cold seawater. REE complexes with hard ligands (e.g., fluoride and chloride) are less stable at low temperatures and trivalent Eu has an ionic radius similar to that of Ca2+ and the other REE, and so they behave coherently. In contrast, anhydrites that exhibit slight LREE-depletion [(La_bs)/(Yb_bs) = 0.4-1.4] and a distinct negative anomaly [(Eu_bs)/(Eu*_bs) = 0.2-0.8] are inferred to have precipitated from mixes of end-member hydrothermal fluid and conductively heated seawater. The LREE depletion results from the presence of very stable LREE chloro-complexes that effectively limit the availability of the LREE for partitioning into anhydrite.
Above 250°C, Eu is present only in divalent form as chloride complexes, and discrimination against Eu2+ is likely due to both the mismatch in ionic radii between Eu2+ and Ca2+, and the strong chloro-complexation of divalent Eu which promotes stability in the fluid and inhibits partitioning of Eu2+ into precipitating anhydrite. These variations in REE behavior attest to rapid fluctuations in thermal regime, fluid flow and mixing in the subsurface of the TAG mound that give rise to heterogeneity in the formation conditions of individual anhydrite crystals.
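For readers unfamiliar with the notation, the two REE indices used above can be sketched as follows (the concentrations are hypothetical, not measured TAG data; Eu* is interpolated here as the geometric mean of the neighbouring Sm and Gd values, a common convention):

```python
import math

def normalize(sample, reference):
    """Normalize element concentrations to a reference composition
    (here, the end-member black-smoker fluid; subscript "bs" in the text)."""
    return {el: sample[el] / reference[el] for el in sample}

def la_yb_ratio(norm):
    """(La/Yb) slope of the normalized REE pattern."""
    return norm["La"] / norm["Yb"]

def eu_anomaly(norm):
    """Eu/Eu*, with Eu* as the geometric mean of neighbouring Sm and Gd."""
    eu_star = math.sqrt(norm["Sm"] * norm["Gd"])
    return norm["Eu"] / eu_star

# Hypothetical concentrations (arbitrary units), for illustration only:
fluid = {"La": 10.0, "Sm": 4.0, "Eu": 6.0, "Gd": 3.0, "Yb": 1.0}
anhydrite = {"La": 5.0, "Sm": 2.0, "Eu": 1.5, "Gd": 1.5, "Yb": 1.0}
norm = normalize(anhydrite, fluid)
print(round(la_yb_ratio(norm), 2), round(eu_anomaly(norm), 2))  # → 0.5 0.5
```

These invented values would fall in the LREE-depleted, negative-Eu-anomaly group, i.e. the conductively-heated-seawater mixing scenario described above.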
Abstract:
We introduce two probabilistic, data-driven models that predict a ship's speed and the situations in which a ship is likely to get stuck in ice, based on the joint effect of ice features such as the thickness and concentration of level ice, ice ridges, and rafted ice; ice compression is also considered. Two datasets were used to develop the models. First, data from the Automatic Identification System on the performance of a selected ship were used. Second, HELMI, a numerical ice model developed at the Finnish Meteorological Institute, provided information about the ice field. The relations between ice conditions and ship movements were established using Bayesian learning algorithms. The case study presented in this paper considers a single, unassisted trip of an ice-strengthened bulk carrier between two Finnish ports in challenging ice conditions that varied in time and space. The obtained results show good prediction power of the models: on average 80% accuracy for predicting the ship's speed within specified bins, and above 90% for predicting cases where a ship may get stuck in ice. We expect this new approach to facilitate safe and effective route selection in ice-covered waters, where ship performance is reflected in the objective function.
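The data-driven idea can be reduced to a minimal sketch, not the actual AIS/HELMI pipeline: learn a conditional probability table P(speed bin | ice state) from paired observations, then predict the most probable bin. The ice states, speed bins, and counts below are invented.

```python
from collections import Counter, defaultdict

def learn_cpt(observations):
    """Build a conditional frequency table from (ice_state, speed_bin) pairs."""
    table = defaultdict(Counter)
    for ice_state, speed_bin in observations:
        table[ice_state][speed_bin] += 1
    return table

def predict(table, ice_state):
    """Return the most probable speed bin and its conditional probability."""
    counts = table[ice_state]
    total = sum(counts.values())
    bin_, n = counts.most_common(1)[0]
    return bin_, n / total

# Hypothetical training pairs: ice_state = (level-ice thickness class,
# ridging class); speed_bin = discretized ship speed, "stuck" included.
data = ([(("thick", "ridged"), "stuck")] * 8
        + [(("thick", "ridged"), "slow")] * 2
        + [(("thin", "level"), "fast")] * 9
        + [(("thin", "level"), "slow")] * 1)
cpt = learn_cpt(data)
print(predict(cpt, ("thick", "ridged")))  # → ('stuck', 0.8)
```

A full Bayesian treatment would also model dependencies among the ice features themselves, but the table already captures the core prediction step.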
Abstract:
Maritime accidents involving ships carrying passengers may pose a high risk with respect to human casualties. Effective risk mitigation requires insight into the process of risk escalation, which in turn calls for a proactive approach to risk modelling for maritime transportation systems. Most existing models are based on historical data on maritime accidents and can thus be considered reactive rather than proactive. This paper introduces a systematic, transferable and proactive framework for estimating the risk of maritime transportation systems, meeting the requirements stemming from the adopted formal definition of risk. The framework focuses on ship-ship collisions in the open sea, with a RoRo/Passenger ship (RoPax) considered as the struck ship. First, it identifies the events that follow a collision between two ships in the open sea; second, it evaluates the probabilities of these events, concluding by determining the severity of a collision. The risk framework is developed using Bayesian Belief Networks and utilizes a set of analytical methods to estimate the risk model parameters. The model can be run with the GeNIe software package. Finally, a case study is presented in which the risk framework is applied to a maritime transportation system operating in the Gulf of Finland (GoF). The results are compared with historical data and available models in which a RoPax was involved in a collision, and good agreement with the available records is found.
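The chain-of-events idea can be sketched with a toy two-node network; the event names and probabilities below are hypothetical and are not taken from the GeNIe model. The marginal probability of each consequence is obtained by summing over the intermediate event:

```python
def marginal_consequence(p_event, p_consequence_given_event):
    """Marginalize: P(c) = sum over events e of P(c | e) * P(e)."""
    outcomes = set()
    for table in p_consequence_given_event.values():
        outcomes.update(table)
    return {c: sum(p_consequence_given_event[e].get(c, 0.0) * p
                   for e, p in p_event.items())
            for c in outcomes}

# Invented numbers, for illustration only:
p_hull_breach = {"breach": 0.3, "no_breach": 0.7}
p_severity = {
    "breach":    {"ship_lost": 0.20, "ship_survives": 0.80},
    "no_breach": {"ship_lost": 0.01, "ship_survives": 0.99},
}
m = marginal_consequence(p_hull_breach, p_severity)
print(round(m["ship_lost"], 3))  # → 0.067
```

A real Bayesian Belief Network chains many such conditional tables (collision energy, breach location, flooding, evacuation) before the final severity node.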
Abstract:
The BSRN Toolbox is a software package supplied by the WRMC and is freely available to all station scientists and data users. The main features of the package include a download manager for Station-to-Archive files, a tool to convert files into human-readable TAB-separated ASCII tables (similar to those output by the PANGAEA database), and a tool to check data sets for violations of the "BSRN Global Network recommended QC tests, V2.0" quality criteria. The latter tool creates quality codes, one per measured value, indicating whether the data are "physically possible" or "extremely rare," or whether "intercomparison limits are exceeded." In addition, auxiliary data such as the solar zenith angle or global radiation calculated from the diffuse and direct components can be output. All output from the QC tool can be visualized using PanPlot (doi:10.1594/PANGAEA.816201).
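The per-value quality coding can be sketched as nested range checks; the limit values below are placeholders, not the official V2.0 thresholds:

```python
def quality_code(value, possible, rare, intercomparison):
    """Flag one measured value against three nested (lo, hi) limit ranges,
    from the widest (physically possible) to the narrowest (intercomparison)."""
    lo, hi = possible
    if not (lo <= value <= hi):
        return "not physically possible"
    lo, hi = rare
    if not (lo <= value <= hi):
        return "extremely rare"
    lo, hi = intercomparison
    if not (lo <= value <= hi):
        return "intercomparison limits exceeded"
    return "ok"

# Placeholder limits for a shortwave irradiance channel (W/m^2):
LIMITS = dict(possible=(-4, 1500), rare=(-2, 1300), intercomparison=(0, 1200))
print(quality_code(1250, **LIMITS))  # → intercomparison limits exceeded
print(quality_code(1400, **LIMITS))  # → extremely rare
```

Running such a check per value yields exactly one quality code per measurement, as the toolbox does.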
Abstract:
We have analyzed the performance of a PET demonstrator formed by two sectors of four monolithic detector blocks placed face-to-face. Both front-end and read-out electronics have been evaluated by means of coincidence measurements using a rotating 22Na source placed at the center of the sectors in order to emulate the behavior of a complete full ring. A continuous training method based on neural network (NN) algorithms has been carried out to determine the entrance points over the surface of the detectors. Reconstructed images from 1 MBq 22Na point source and 22Na Derenzo phantom have been obtained using both filtered back projection (FBP) analytic methods and the OSEM 3D iterative algorithm available in the STIR software package [1]. Preliminary data on image reconstruction from a 22Na point source with Ø = 0.25 mm show spatial resolutions from 1.7 to 2.1 mm FWHM in the transverse plane. The results confirm the viability of this design for the development of a full-ring brain PET scanner compatible with magnetic resonance imaging for human studies.
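A resolution figure such as "1.7 to 2.1 mm FWHM" is typically extracted from a reconstructed point-source profile. This sketch (synthetic data, not the demonstrator's) measures the full width at half maximum by linear interpolation at the half-maximum crossings:

```python
import math

def fwhm(positions, values):
    """Full width at half maximum of a sampled unimodal profile, using
    linear interpolation at the two half-maximum crossings."""
    half = max(values) / 2.0
    peak = values.index(max(values))
    i = peak
    while values[i] > half:               # walk left to the crossing
        i -= 1
    left = positions[i] + (positions[i + 1] - positions[i]) * \
        (half - values[i]) / (values[i + 1] - values[i])
    j = peak
    while values[j] > half:               # walk right to the crossing
        j += 1
    right = positions[j - 1] + (positions[j] - positions[j - 1]) * \
        (half - values[j - 1]) / (values[j] - values[j - 1])
    return right - left

# Synthetic Gaussian profile, sigma = 0.85 mm → FWHM ≈ 2.355 * 0.85 ≈ 2.0 mm.
xs = [i * 0.1 - 5 for i in range(101)]
ys = [math.exp(-x ** 2 / (2 * 0.85 ** 2)) for x in xs]
print(round(fwhm(xs, ys), 2))  # → 2.0
```

In practice the profile is taken through the reconstructed point source in each transverse direction, giving the quoted per-axis FWHM values.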
Abstract:
The algorithms and graphical user interface software package OPT-PROx were developed to meet food engineering needs related to the simulation and optimization of canned food thermal processing. The OPT-PROx package (http://tomakechoice.com/optprox/index.html) uses an adaptive random search algorithm and its modification coupled with a penalty-function approach, together with finite difference methods with cubic spline approximation. A variety of thermal food processing optimization problems with different objectives and constraints can be solved with the developed software. OPT-PROx supports the following geometries: (1) cylinder, (2) rectangle, (3) sphere. The mean-square-error minimization principle is used to estimate the heat transfer coefficient of the food heated under optimal conditions. The user-friendly dialogue and the numerical procedures employed make the OPT-PROx software useful to food scientists in research and education, as well as to engineers involved in the optimization of thermal food processing.
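The finite-difference core that such a package needs can be sketched for the simplest case: an explicit 1-D conduction scheme with hypothetical parameters. OPT-PROx's actual solver, geometry handling, and cubic spline approximation are not reproduced here.

```python
def heat_slab(t_initial, t_retort, alpha, half_thickness, nodes=11,
              dt=1.0, total_time=600.0):
    """Explicit FTCS scheme for dT/dt = alpha * d2T/dx2 in a slab whose
    surface is held at retort temperature; returns the centre (slowest
    heating point) temperature after total_time seconds."""
    dx = half_thickness / (nodes - 1)
    r = alpha * dt / dx ** 2
    assert r <= 0.5, "explicit scheme unstable"   # stability limit
    temp = [t_initial] * nodes
    temp[-1] = t_retort                           # surface node at x = L
    for _ in range(int(total_time / dt)):
        new = temp[:]
        new[0] = temp[0] + 2 * r * (temp[1] - temp[0])   # symmetry at centre
        for i in range(1, nodes - 1):
            new[i] = temp[i] + r * (temp[i - 1] - 2 * temp[i] + temp[i + 1])
        temp = new
    return temp[0]

# Hypothetical can: thermal diffusivity 1.6e-7 m^2/s, 2 cm thick slab,
# heated from 40 °C in a 121 °C retort for 30 minutes.
centre = heat_slab(t_initial=40.0, t_retort=121.0, alpha=1.6e-7,
                   half_thickness=0.01, total_time=1800.0)
print(round(centre, 1))
```

An optimizer such as the adaptive random search mentioned above would repeatedly call a solver like this while varying the retort temperature profile against lethality and quality constraints.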
Abstract:
The analysis of interdependence between time series has become an important field of research in recent years, mainly as a result of advances in the characterization of dynamical systems from the signals they produce, the introduction of concepts such as generalized and phase synchronization, and the application of information theory to time series analysis. In neurophysiology, different analytical tools stemming from these concepts have been added to the ‘traditional’ set of linear methods, which includes cross-correlation and the coherency function in the time and frequency domains, respectively, as well as more elaborate tools such as Granger causality. This growing number of approaches for assessing the existence of functional (FC) or effective connectivity (EC) between two (or among many) neural networks, along with the mathematical complexity of the corresponding time series analysis tools, makes it desirable to arrange them into a unified, easy-to-use software package. The goal is to allow neuroscientists, neurophysiologists and researchers from related fields to easily access and make use of these analysis methods from a single integrated toolbox. Here we present HERMES (http://hermes.ctb.upm.es), a toolbox for the Matlab® environment (The MathWorks, Inc.), designed to study functional and effective brain connectivity from neurophysiological data such as multivariate EEG and/or MEG records. It also includes visualization tools and statistical methods to address the problem of multiple comparisons. We believe this toolbox will be very helpful to researchers working in the emerging field of brain connectivity analysis.
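The most basic FC index mentioned above can be sketched in a few lines (pure Python, not HERMES code): the normalized cross-correlation between two signals at a given lag.

```python
import math

def cross_correlation(x, y, lag=0):
    """Pearson correlation between x[t] and y[t + lag]."""
    if lag > 0:
        x, y = x[:-lag], y[lag:]
    elif lag < 0:
        x, y = x[-lag:], y[:lag]
    mx, my = sum(x) / len(x), sum(y) / len(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = math.sqrt(sum((a - mx) ** 2 for a in x) *
                    sum((b - my) ** 2 for b in y))
    return num / den

# Toy signals: y is x delayed by 2 samples, so correlation peaks at lag 2.
x = [math.sin(0.3 * t) for t in range(100)]
y = [0.0, 0.0] + x[:-2]
print(round(cross_correlation(x, y, lag=2), 3))  # → 1.0
```

Scanning the lag and locating the maximum gives a simple delay estimate between two channels; the more elaborate FC/EC measures in the toolbox generalize this idea to nonlinear and directed dependencies.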