55 results for leave one out cross validation

in BORIS: Bern Open Repository and Information System - Bern - Switzerland


Relevance:

100.00%

Publisher:

Abstract:

This paper describes informatics for cross-sample analysis with comprehensive two-dimensional gas chromatography (GCxGC) and high-resolution mass spectrometry (HRMS). GCxGC-HRMS analysis produces large data sets that are rich with information, but highly complex. The size of the data and the volume of information require automated processing for comprehensive cross-sample analysis, but the complexity poses a challenge for developing robust methods. The approach developed here analyzes GCxGC-HRMS data from multiple samples to extract a feature template that comprehensively captures the pattern of peaks detected in the retention-time plane. Then, for each sample chromatogram, the template is geometrically transformed to align with the detected peak pattern and generate a set of feature measurements for cross-sample analyses such as sample classification and biomarker discovery. The approach avoids the intractable problem of comprehensive peak matching by using a few reliable peaks for alignment and peak-based retention-plane windows to define comprehensive features that can be reliably matched for cross-sample analysis. The informatics are demonstrated with a set of 18 samples from breast-cancer tumors, each from a different individual, six each for Grades 1-3. The features allow classification that matches grading by a cancer pathologist with 78% success in leave-one-out cross-validation experiments. The HRMS signatures of the features of interest can be examined to determine elemental compositions and identify compounds.
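As a hedged illustration of how such a figure is typically computed (the abstract does not name the classifier; the feature matrix `X`, grades `y` and the LDA classifier below are placeholder assumptions, not the paper's method), leave-one-out accuracy over 18 samples might look like this:

```python
# Sketch only: leave-one-out classification of 18 tumor samples into
# Grades 1-3. X stands in for the template-derived feature measurements.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import LeaveOneOut, cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(18, 40))   # placeholder feature measurements
y = np.repeat([1, 2, 3], 6)     # tumor grades, six samples per grade

scores = cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=LeaveOneOut())
print(f"LOOCV accuracy: {scores.mean():.0%}")  # fraction of held-out samples graded correctly
```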

Relevance:

100.00%

Publisher:

Abstract:

Climate and environmental reconstructions from natural archives are important for the interpretation of current climatic change. Few quantitative high-resolution reconstructions exist for South America, which is the only land mass extending from the tropics to the southern high latitudes at 56°S. We analyzed sediment cores from two adjacent lakes in Northern Chilean Patagonia, Lago Castor (45°36′S, 71°47′W) and Laguna Escondida (45°31′S, 71°49′W). Radiometric dating (210Pb, 137Cs, 14C-AMS) suggests that the cores reach back to c. 900 BC (Laguna Escondida) and c. 1900 BC (Lago Castor). Both lakes show similarities and reproducibility in sedimentation-rate changes and tephra-layer deposition. We found eight macroscopic tephras (0.2–5.5 cm thick) dated to 1950 BC, 1700 BC, 300 BC, 50 BC, AD 90, AD 160, AD 400 and AD 900. These can be used as regional time-synchronous stratigraphic markers. The two thickest tephras represent known, well-dated explosive eruptions of Hudson volcano around 1950 and 300 BC. Biogenic silica flux revealed a climate signal in both lakes and correlated with annual temperature reanalysis data (calibration 1900–2006 AD; Lago Castor r = 0.37; Laguna Escondida r = 0.42, seven-year filtered data). We used a linear inverse regression plus scaling model for calibration and leave-one-out cross-validation (RMSEv = 0.56 °C) to reconstruct sub-decadal-scale temperature variability for Laguna Escondida back to AD 400. The lower part of the core from Laguna Escondida prior to AD 400 and the core of Lago Castor are strongly influenced by primary and secondary tephras and were therefore not used for the temperature reconstruction. The temperature reconstruction from Laguna Escondida shows cold conditions in the 5th century (relative to the 20th-century mean), warmer temperatures from AD 600 to AD 1150 and colder temperatures from AD 1200 to AD 1450. From AD 1450 to AD 1700 our reconstruction shows a period with stronger variability and, on average, higher values than the 20th-century mean. Until AD 1900 the temperature values decrease but stay slightly above the 20th-century mean. Most of the centennial-scale features are reproduced in the few other natural climate archives in the region. The early onset of cool conditions from c. AD 1200 onward seems to be confirmed for this region.
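The reported RMSEv is the root-mean-square error of the leave-one-out predictions of the calibration model. A minimal sketch, assuming a simple univariate linear calibration between biogenic silica flux and temperature (the paper's inverse regression plus scaling step is not reproduced, and the data below are placeholders):

```python
# Leave-one-out verification RMSE (RMSEv) for a univariate proxy calibration.
import numpy as np

def loo_rmse(x, y):
    """Fit y ~ a*x + b with each observation held out once; return the
    RMSE of the held-out predictions."""
    errs = []
    for i in range(len(x)):
        mask = np.arange(len(x)) != i
        a, b = np.polyfit(x[mask], y[mask], deg=1)
        errs.append(y[i] - (a * x[i] + b))
    return np.sqrt(np.mean(np.square(errs)))

rng = np.random.default_rng(1)
temp = rng.normal(8.0, 0.7, size=100)            # placeholder annual temperature, deg C
bsi = 2.0 * temp + rng.normal(0, 1.0, size=100)  # placeholder proxy with noise
print(f"RMSEv = {loo_rmse(bsi, temp):.2f} deg C")
```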

Relevance:

100.00%

Publisher:

Abstract:

Purpose: Proper delineation of ocular anatomy in 3D imaging is a major challenge, particularly when developing treatment plans for ocular diseases. Magnetic Resonance Imaging (MRI) is now used in clinical practice for diagnosis confirmation and treatment planning of retinoblastoma in infants, where it serves as a source of information complementary to fundus or ultrasound imaging. Here we present a framework to fully automatically segment the eye anatomy in MRI based on 3D Active Shape Models (ASM); we validate the results and present a proof of concept for automatically segmenting pathological eyes. Material and Methods: Manual and automatic segmentation were performed on 24 images of healthy children's eyes (age 3.29 ± 2.15 years). Imaging was performed using a 3T MRI scanner. The ASM comprises the lens, the vitreous humor, the sclera and the cornea. The model was fitted by first automatically detecting the positions of the eye center, the lens and the optic nerve, then aligning the model and fitting it to the patient. We validated our segmentation method using leave-one-out cross-validation. The segmentation results were evaluated by measuring the overlap using the Dice Similarity Coefficient (DSC) and the mean distance error. Results: We obtained a DSC of 94.90 ± 2.12% for the sclera and the cornea, 94.72 ± 1.89% for the vitreous humor and 85.16 ± 4.91% for the lens. The mean distance error was 0.26 ± 0.09 mm. The entire process took 14 s on average per eye. Conclusion: We provide a reliable and accurate tool that enables clinicians to automatically segment the sclera, the cornea, the vitreous humor and the lens using MRI. We additionally present a proof of concept for fully automatically segmenting pathological eyes. This tool reduces the time needed for eye-shape delineation and can thus help clinicians when planning eye treatment and confirming the extent of the tumor.
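The DSC used for validation is a standard overlap measure between two segmentations. A minimal sketch for binary masks (`auto_mask` and `manual_mask` are hypothetical 3D label volumes, one per structure):

```python
# Dice Similarity Coefficient: DSC = 2|A ∩ B| / (|A| + |B|).
import numpy as np

def dice(a, b):
    """Overlap between two boolean masks of the same shape."""
    a, b = a.astype(bool), b.astype(bool)
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

# e.g. dice(auto_mask, manual_mask) per structure (sclera, lens, ...)
```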

Relevance:

100.00%

Publisher:

Abstract:

Nitazoxanide (2-acetolyloxy-N-(5-nitro-2-thiazolyl)benzamide; NTZ) is the parent compound of a novel class of broad-spectrum anti-parasitic compounds named thiazolides. NTZ is active against a wide variety of intestinal and tissue-dwelling helminths, protozoa, enteric bacteria and a number of viruses infecting animals and humans. While potent, this breadth poses a problem in practice, since such non-selectivity can lead to undesired side effects in both humans and animals. In this study, we used real-time PCR to determine the in vitro activities of 29 different thiazolides (NTZ derivatives), which carry distinct modifications on both the thiazole and the benzene moieties, against the tachyzoite stage of the intracellular protozoan Neospora caninum. The goal was to identify a highly active compound lacking the undesirable nitro group, which would have a more specific applicability, such as in food animals. By applying self-organizing molecular field analysis (SOMFA), these data were used to develop a predictive model for future drug design. SOMFA performs self-alignment of the molecules and takes into account their steric and electrostatic properties in order to determine 3D quantitative structure-activity relationship models. The best model was obtained by overlay of the thiazole moieties. Plotting predicted versus experimentally determined activity produced an r2 value of 0.8052, and cross-validation using the leave-one-out methodology resulted in a q2 value of 0.7987. A master grid map showed that large steric groups at the R2 position, the nitrogen of the amide bond and position Y could greatly reduce activity, while large steric groups placed at positions X, R4 and around the oxygen atom of the amide bond may increase the activity of thiazolides against Neospora caninum tachyzoites. The model obtained here will be an important predictive tool for the future development of this important class of drugs.
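The q2 statistic is the leave-one-out analogue of r2: one minus the predictive residual sum of squares (PRESS) over the total sum of squares. A generic sketch (the `fit_predict` callable and the linear stand-in model are assumptions for illustration; SOMFA itself is not reproduced):

```python
# Leave-one-out q^2 = 1 - PRESS / SS_total, for any fit/predict routine.
import numpy as np

def q2_loo(fit_predict, X, y):
    """fit_predict(X_train, y_train, x_test) -> predicted activity."""
    press = 0.0
    for i in range(len(y)):
        mask = np.arange(len(y)) != i
        press += (y[i] - fit_predict(X[mask], y[mask], X[i])) ** 2
    return 1.0 - press / np.sum((y - y.mean()) ** 2)

# Usage with a hypothetical linear activity model over 29 compounds:
rng = np.random.default_rng(6)
X = rng.normal(size=(29, 5))                          # placeholder descriptors
y = X @ rng.normal(size=5) + rng.normal(0, 0.3, 29)   # placeholder activities
fp = lambda Xt, yt, x: np.linalg.lstsq(Xt, yt, rcond=None)[0] @ x
print(f"q2 = {q2_loo(fp, X, y):.3f}")
```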

Relevance:

100.00%

Publisher:

Abstract:

Constructing a 3D surface model from sparse-point data is a nontrivial task. Here, we report an accurate and robust approach for reconstructing a surface model of the proximal femur from sparse-point data and a dense-point distribution model (DPDM). The problem is formulated as a three-stage optimal estimation process. The first stage, affine registration, iteratively estimates a scale and a rigid transformation between the mean surface model of the DPDM and the sparse input points. The estimation results of the first stage are used to establish point correspondences for the second stage, statistical instantiation, which stably instantiates a surface model from the DPDM using a statistical approach. This surface model is then fed to the third stage, kernel-based deformation, which further refines the surface model. Outliers are handled by consistently employing the least trimmed squares (LTS) approach with a roughly estimated outlier rate in all three stages. If an optimal value of the outlier rate is preferred, we propose a hypothesis-testing procedure to estimate it automatically. We present our validations using four experiments: (1) a leave-one-out experiment; (2) an experiment evaluating the present approach for handling pathology; (3) an experiment evaluating the present approach for handling outliers; and (4) an experiment reconstructing surface models of seven dry cadaver femurs using clinically relevant data without and with added noise. Our validation results demonstrate the robust performance of the present approach in handling outliers, pathology, and noise. An average 95-percentile error of 1.7-2.3 mm was found when the present approach was used to reconstruct surface models of the cadaver femurs from sparse-point data with noise added.
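Least trimmed squares fits a model to the h points with the smallest squared residuals, given an assumed outlier rate. A minimal linear-model sketch of the idea using iterative "concentration" steps (the paper applies LTS within each of its three estimation stages, not to a plain linear fit as here):

```python
# LTS sketch: repeatedly fit on the h best-fitting points, then re-select
# the h points with the smallest residuals, until the subset stabilises.
import numpy as np

def lts_fit(X, y, outlier_rate=0.2, n_iter=20):
    h = int(np.ceil((1.0 - outlier_rate) * len(y)))  # points kept per fit
    keep = np.arange(len(y))[:h]                     # crude initial subset
    for _ in range(n_iter):
        beta, *_ = np.linalg.lstsq(X[keep], y[keep], rcond=None)
        resid = np.abs(y - X @ beta)
        new_keep = np.argsort(resid)[:h]             # h smallest residuals
        if set(new_keep) == set(keep):
            break
        keep = new_keep
    return beta, keep                                # fit and inlier indices
```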

Relevance:

100.00%

Publisher:

Abstract:

Reconstruction of shape and intensity from 2D x-ray images has drawn increasing attention. Previous work suffers from long computing times due to its iterative optimization and the requirement of generating digitally reconstructed radiographs (DRRs) within each iteration. In this paper, we propose a novel method that uses a patient-specific 3D surface model reconstructed from 2D x-ray images as a surrogate to obtain a patient-specific volumetric intensity reconstruction via partial least squares regression. No DRR generation is needed. The method was validated on 20 cadaveric proximal femurs by performing a leave-one-out study. Qualitative and quantitative results demonstrated the efficacy of the present method. Compared to existing work, the present method has the advantage of a much shorter computing time and can be applied to both DXA and conventional x-ray images, so it may hold potential for clinical routine tasks such as total hip arthroplasty (THA).
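A hedged sketch of the surrogate mapping described here: PLS regression from surface-model parameters to volumetric intensity parameters, evaluated leave-one-out. The dimensions, data and component count below are placeholder assumptions, not the paper's:

```python
# Leave-one-out evaluation of a PLS surrogate over 20 femur cases.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import LeaveOneOut

rng = np.random.default_rng(2)
shape = rng.normal(size=(20, 15))       # placeholder shape coefficients per femur
intensity = rng.normal(size=(20, 30))   # placeholder intensity coefficients

errs = []
for train, test in LeaveOneOut().split(shape):
    pls = PLSRegression(n_components=5).fit(shape[train], intensity[train])
    errs.append(np.linalg.norm(pls.predict(shape[test]) - intensity[test]))
print(f"mean leave-one-out error: {np.mean(errs):.2f}")
```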

Relevance:

100.00%

Publisher:

Abstract:

Chrysophyte cysts are recognized as powerful proxies of cold-season temperatures. In this paper we use the relationship between chrysophyte assemblages and the number of days below 4 °C (DB4 °C) in the epilimnion of a lake in northern Poland to develop a transfer function and to reconstruct winter severity in Poland for the last millennium. DB4 °C is a climate variable related to the length of the winter. Multivariate ordination techniques were used to study the distribution of chrysophytes from sediment traps of 37 lowland lakes distributed along a variety of environmental and climatic gradients in northern Poland. Of all the environmental variables measured, stepwise variable selection and individual redundancy analyses (RDA) identified DB4 °C as the most important variable for chrysophytes, explaining a portion of variance independent of variables related to water chemistry (conductivity, chlorides, K, sulfates), which were also important. A quantitative transfer function was created to estimate DB4 °C from sedimentary assemblages using partial least squares regression (PLS). The two-component model (PLS-2) had a cross-validated coefficient of determination of R2cv = 0.58, with a root mean squared error of prediction (RMSEP, based on leave-one-out cross-validation) of 3.41 days. The resulting transfer function was applied to an annually varved sediment core from Lake Żabińskie, providing a new sub-decadal quantitative reconstruction of DB4 °C with high chronological accuracy for the period AD 1000–2010. During Medieval Times (AD 1180–1440) winters were generally shorter (warmer) except for a decade with very long and severe winters around AD 1260–1270 (following the AD 1258 volcanic eruption). The 16th and 17th centuries and the beginning of the 19th century experienced very long, severe winters. Comparison with other European cold-season reconstructions and atmospheric indices for this region indicates that a large part of the winter variability (reconstructed DB4 °C) is due to the interplay between the oscillations of the zonal flow controlled by the North Atlantic Oscillation (NAO) and the influence of continental anticyclonic systems (Siberian High, East Atlantic/Western Russia pattern). Differences with other European records are attributed to geographic climatological differences between Poland and Western Europe (Low Countries, Alps). The striking correspondence between the combined volcanic and solar forcing and the DB4 °C reconstruction prior to the 20th century suggests that winter climate in Poland responds mostly to naturally forced variability (volcanic and solar) and that the influence of unforced variability is low.
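RMSEP under leave-one-out is the root-mean-square of the held-out prediction errors of the two-component PLS model. A minimal sketch with placeholder assemblage data (the paper's taxa percentages and any data transformations are not reproduced):

```python
# Leave-one-out RMSEP for a two-component PLS transfer function.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict

rng = np.random.default_rng(3)
assemblage = rng.random(size=(37, 25))   # placeholder taxa abundances, 37 lakes
db4 = rng.normal(40.0, 8.0, size=37)     # placeholder days below 4 deg C

pred = cross_val_predict(PLSRegression(n_components=2), assemblage, db4,
                         cv=LeaveOneOut())
rmsep = np.sqrt(np.mean((db4 - pred.ravel()) ** 2))
print(f"RMSEP = {rmsep:.2f} days")
```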

Relevance:

100.00%

Publisher:

Abstract:

Patient-specific biomechanical models including local bone mineral density and anisotropy have gained importance for assessing musculoskeletal disorders. However, the trabecular bone anisotropy captured by high-resolution imaging is only available at the peripheral skeleton in clinical practice. In this work, we propose a supervised learning approach to predict trabecular bone anisotropy that builds on a novel set of pose-invariant feature descriptors. The statistical relationship between trabecular bone anisotropy and the feature descriptors was learned from a database of pairs of high-resolution QCT and clinical QCT reconstructions. On a set of leave-one-out experiments, we compared the accuracy of the proposed approach to previous ones, and report a mean prediction error of 6% for the tensor norm, 6% for the degree of anisotropy and 19° for the principal tensor direction. These findings show the potential of the proposed approach to predict trabecular bone anisotropy from clinically available QCT images.
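A hedged sketch of the three reported error measures for a predicted versus ground-truth fabric tensor. The degree-of-anisotropy definition used below (largest over smallest eigenvalue) is one common convention and an assumption here, as is the symmetric positive-definite input:

```python
# Error measures between predicted and ground-truth fabric tensors.
import numpy as np

def tensor_errors(T_pred, T_true):
    """Assumes 3x3 symmetric positive-definite fabric tensors."""
    norm_err = abs(np.linalg.norm(T_pred) - np.linalg.norm(T_true)) \
               / np.linalg.norm(T_true)
    def doa(T):                          # one DoA convention: lmax / lmin
        w = np.linalg.eigvalsh(T)        # eigenvalues in ascending order
        return w[-1] / w[0]
    doa_err = abs(doa(T_pred) - doa(T_true)) / doa(T_true)
    v_pred = np.linalg.eigh(T_pred)[1][:, -1]   # principal eigenvectors
    v_true = np.linalg.eigh(T_true)[1][:, -1]
    angle = np.degrees(np.arccos(np.clip(abs(v_pred @ v_true), 0.0, 1.0)))
    return norm_err, doa_err, angle
```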

Relevance:

100.00%

Publisher:

Abstract:

Finite element (FE) analysis is an important computational tool in biomechanics. However, its adoption into clinical practice has been hampered by its computational complexity and the high technical competence it requires of clinicians. In this paper we propose a supervised learning approach to predict the outcome of FE analysis. We demonstrate our approach on clinical CT and X-ray femur images for FE predictions (FEP), with features extracted, respectively, from a statistical shape model and from 2D-based morphometric and density information. Using leave-one-out experiments and sensitivity analysis on a database of 89 clinical cases, our method is capable of predicting the distribution of stress values for a walking loading condition with an average correlation coefficient of 0.984 and 0.976 for CT and X-ray images, respectively. These findings suggest that supervised learning approaches have the potential to leverage the clinical integration of mechanical simulations for the treatment of musculoskeletal conditions.
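The reported figure is an average of per-case correlations between predicted and FE-computed stress fields across the leave-one-out folds. A minimal sketch (the per-case nodal stress vectors are assumed inputs):

```python
# Average Pearson correlation between predicted and FE-computed stresses.
import numpy as np

def mean_correlation(pred_stresses, fe_stresses):
    """Each argument: list of per-case 1D arrays of nodal stress values."""
    rs = [np.corrcoef(p, f)[0, 1] for p, f in zip(pred_stresses, fe_stresses)]
    return float(np.mean(rs))
```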

Relevance:

100.00%

Publisher:

Abstract:

In this study, we demonstrate the power of applying complementary DNA (cDNA) microarray technology to identifying candidate loci that exhibit subtle differences in expression levels associated with a complex trait in natural populations of a nonmodel organism. Using a highly replicated experimental design involving 180 cDNA microarray experiments, we measured gene-expression levels from 1098 transcript probes in 90 individuals originating from six brown trout (Salmo trutta) and one Atlantic salmon (Salmo salar) population, which follow either a migratory or a sedentary life history. We identified several candidate genes associated with preparatory adaptations to different life histories in salmonids, including genes encoding for transaldolase 1, constitutive heat-shock protein HSC70-1 and endozepine. Some of these genes clustered into functional groups, providing insight into the physiological pathways potentially involved in the expression of life-history related phenotypic differences. Such differences included the down-regulation of genes involved in the respiratory system of future migratory individuals. In addition, we used linear discriminant analysis to identify a set of 12 genes that correctly classified immature individuals as migratory or sedentary with high accuracy. Using the expression levels of these 12 genes, 17 out of 18 individuals used for cross-validation were correctly assigned to their respective life-history phenotype. Finally, we found various candidate genes associated with physiological changes that are likely to be involved in preadaptations to seawater in anadromous populations of the genus Salmo, one of which was identified to encode for nucleophosmin 1. Our findings thus provide new molecular insights into salmonid life-history variation, opening new perspectives in the study of this complex trait.
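A hedged sketch of the classification step: leave-one-out assignment of individuals from a 12-gene expression panel with linear discriminant analysis, echoing the reported 17-of-18 result. The data and the 9/9 class split below are placeholders:

```python
# LOO assignment of individuals to migratory vs. sedentary life histories.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import LeaveOneOut, cross_val_predict

rng = np.random.default_rng(4)
expr = rng.normal(size=(18, 12))                     # placeholder 12-gene panel
life_history = np.repeat(["migratory", "sedentary"], 9)

assigned = cross_val_predict(LinearDiscriminantAnalysis(), expr, life_history,
                             cv=LeaveOneOut())
print(f"{(assigned == life_history).sum()} of {len(life_history)} correctly assigned")
```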

Relevance:

100.00%

Publisher:

Abstract:

Index tracking has become one of the most common strategies in asset management. The index-tracking problem consists of constructing a portfolio that replicates the future performance of an index by including only a subset of the index constituents in the portfolio. Finding the most representative subset is challenging when the number of stocks in the index is large. We introduce a new three-stage approach that first identifies promising subsets by employing data-mining techniques, then determines the stock weights in the subsets using mixed-binary linear programming, and finally evaluates the subsets based on cross-validation. The best subset is returned as the tracking portfolio. Our approach outperforms state-of-the-art methods in terms of out-of-sample performance and running times.
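A much-simplified sketch of the evaluation stage: candidate subsets are scored by out-of-sample tracking error and the best one is returned. Unconstrained least squares stands in for the paper's mixed-binary linear program, and a single train/test split stands in for full cross-validation; all of these substitutions are assumptions for illustration:

```python
# Score candidate constituent subsets by out-of-sample tracking error.
import numpy as np

def tracking_error(stock_ret, index_ret, subset, split=0.7):
    """stock_ret: (T, N) return matrix; subset: list of column indices."""
    n = int(split * len(index_ret))
    w, *_ = np.linalg.lstsq(stock_ret[:n, subset], index_ret[:n], rcond=None)
    return float(np.std(stock_ret[n:, subset] @ w - index_ret[n:]))

def best_subset(stock_ret, index_ret, candidates):
    return min(candidates, key=lambda s: tracking_error(stock_ret, index_ret, s))
```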

Relevance:

100.00%

Publisher:

Abstract:

The Carrington Event of 1859 is considered to be among the largest space weather events of the last 150 years. We show that only one out of 14 well-resolved ice core records from Greenland and Antarctica has a nitrate spike dated to 1859. No sharp spikes are observed in the Antarctic cores studied here. In Greenland, numerous spikes are observed in the 40 years surrounding 1859, but where other chemistry was measured, all large spikes carry the unequivocal signal of biomass burning plumes, including co-located spikes in ammonium, formate, black carbon and vanillic acid. It seems certain that most spikes in an earlier core, including that claimed for 1859, are also due to biomass burning plumes, and not to solar energetic particle (SEP) events. We conclude that an event as large as the Carrington Event did not leave an observable, widespread imprint in nitrate in polar ice. Nitrate spikes cannot be used to derive the statistics of SEPs.

Relevance:

100.00%

Publisher:

Abstract:

We present an independent calibration model for the determination of biogenic silica (BSi) in sediments, developed from the analysis of synthetic sediment mixtures and the application of Fourier transform infrared spectroscopy (FTIRS) and partial least squares regression (PLSR) modeling. In contrast to current FTIRS applications for quantifying BSi, this new calibration is independent of conventional wet-chemical techniques and their associated measurement uncertainties. This approach also removes the need to develop internal calibrations between the two methods for individual sediment records. For the independent calibration, we produced six series of synthetic sediment mixtures using two purified diatom extracts: one extract mixed with quartz sand, calcite, 60/40 quartz/calcite and two different natural sediments, and a second extract mixed with one of the natural sediments. A total of 306 samples (51 per series) yielded BSi contents ranging from 0 to 100%. The resulting PLSR calibration model between the FTIR spectral information and the defined BSi concentrations of the synthetic sediment mixtures exhibits a strong cross-validated correlation (R2cv = 0.97) and a low root-mean-square error of cross-validation (RMSECV = 4.7%). Application of the independent calibration to natural lacustrine and marine sediments yields robust BSi reconstructions. At present, the synthetic mixtures do not include the variation in organic matter that occurs in natural samples, which may explain the somewhat lower prediction accuracy of the calibration model for organic-rich samples.
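RMSECV is the root-mean-square error of the cross-validated predictions, and minimizing it is also a common way to choose the number of PLSR components. A hedged sketch with placeholder spectra (the paper's spectral preprocessing and exact validation scheme are not reproduced):

```python
# RMSECV as a function of the number of PLSR components.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import KFold, cross_val_predict

rng = np.random.default_rng(5)
spectra = rng.random(size=(306, 400))   # placeholder FTIR absorbances
bsi = rng.uniform(0, 100, size=306)     # defined BSi content, %

for k in (1, 2, 3, 4):
    pred = cross_val_predict(PLSRegression(n_components=k), spectra, bsi,
                             cv=KFold(10, shuffle=True, random_state=0))
    rmsecv = np.sqrt(np.mean((bsi - pred.ravel()) ** 2))
    print(f"{k} components: RMSECV = {rmsecv:.2f} %")
```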

Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND The nine equivalents of nursing manpower use score (NEMS) is used to evaluate critical care nursing workload and occasionally to define hospital reimbursements. Little is known about the caregivers' accuracy in scoring, about factors affecting this accuracy and how validity of scoring is assured. METHODS Accuracy in NEMS scoring of Swiss critical care nurses was assessed using case vignettes. An online survey was performed to assess training and quality control of NEMS scoring and to collect structural and organizational data of participating intensive care units (ICUs). Aggregated structural and procedural data of the Swiss ICU Minimal Data Set were used for matching. RESULTS Nursing staff from 64 (82%) of the 78 certified adult ICUs participated in this survey. Training and quality control of scoring shows large variability between ICUs. A total of 1378 nurses scored one out of 20 case vignettes: accuracy ranged from 63.7% (intravenous medications) to 99.1% (basic monitoring). Erroneous scoring (8.7% of all items) was more frequent than omitted scoring (3.2%). Mean NEMS per case was 28.0 ± 11.8 points (reference score: 25.7 ± 14.2 points). Mean bias was 2.8 points (95% confidence interval: 1.0-4.7); scores below 37.1 points were generally overestimated. Data from units with a greater nursing management staff showed a higher bias. CONCLUSION Overall, nurses assess the NEMS score within a clinically acceptable range. Lower scores are generally overestimated. Inaccurate assessment was associated with a greater size of the nursing management staff. Swiss head nurses consider themselves motivated to assure appropriate scoring and its validation.
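A minimal sketch of the reported bias statistics: the mean difference between nurse-assigned and reference NEMS scores with a 95% t-based confidence interval. The input arrays are placeholders for the per-assessment vignette scores:

```python
# Mean scoring bias and its 95% confidence interval.
import numpy as np
from scipy import stats

def bias_ci(scored, reference, level=0.95):
    """scored, reference: paired arrays of NEMS points per assessment."""
    diff = np.asarray(scored, float) - np.asarray(reference, float)
    mean = diff.mean()
    half = stats.t.ppf(0.5 + level / 2, len(diff) - 1) * stats.sem(diff)
    return mean, (mean - half, mean + half)
```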