19 results for k-Error linear complexity

in BORIS: Bern Open Repository and Information System - Bern - Switzerland


Relevance:

100.00%

Publisher:

Abstract:

State of the art methods for disparity estimation achieve good results for single stereo frames, but temporal coherence in stereo videos is often neglected. In this paper we present a method to compute temporally coherent disparity maps. We define an energy over whole stereo sequences and optimize their Conditional Random Field (CRF) distributions using mean-field approximation. We introduce novel terms for smoothness and consistency between the left and right views, and perform CRF optimization by fast, iterative spatio-temporal filtering with linear complexity in the total number of pixels. Our results rank among the state of the art while having significantly fewer flickering artifacts in stereo sequences.
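The paper's exact energy terms are not given in the abstract; the following is a minimal 1-D sketch of mean-field CRF inference in which the smoothness message is computed by box filtering the label distributions, so each iteration is linear in the number of pixels. The window radius, weight, and quadratic cost model are illustrative assumptions, not the paper's.

```python
import numpy as np

def box_filter(a, r):
    # Running-window mean along axis 0 via cumulative sums: O(n) per label.
    n = a.shape[0]
    c = np.cumsum(np.vstack([np.zeros((1, a.shape[1])), a]), axis=0)
    lo = np.clip(np.arange(n) - r, 0, n)
    hi = np.clip(np.arange(n) + r + 1, 0, n)
    return (c[hi] - c[lo]) / (hi - lo)[:, None]

def mean_field(unary, n_iters=5, w=10.0, r=2):
    # unary: (n_pixels, n_labels) costs; Q holds per-pixel label distributions.
    Q = np.exp(-unary)
    Q /= Q.sum(axis=1, keepdims=True)
    for _ in range(n_iters):
        msg = box_filter(Q, r)        # smoothness message via linear-time filtering
        Q = np.exp(-unary + w * msg)  # labels favoured by neighbours gain mass
        Q /= Q.sum(axis=1, keepdims=True)
    return Q.argmax(axis=1)
```

On a scanline with one outlier label, the filtered messages pull the outlier back to its neighbours' label within a few iterations, while unary costs keep genuine region boundaries in place.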

Relevance:

40.00%

Publisher:

Abstract:

Both historical and idealized climate model experiments are performed with a variety of Earth system models of intermediate complexity (EMICs) as part of a community contribution to the Intergovernmental Panel on Climate Change Fifth Assessment Report. Historical simulations start at 850 CE and continue through to 2005. The standard simulations include changes in forcing from solar luminosity, Earth's orbital configuration, CO2, additional greenhouse gases, land use, and sulphate and volcanic aerosols. In spite of very different modelled pre-industrial global surface air temperatures, overall 20th century trends in surface air temperature and carbon uptake are reasonably well simulated when compared to observed trends. Land carbon fluxes show much more variation between models than ocean carbon fluxes, and recent land fluxes appear to be slightly underestimated. It is possible that recent modelled climate trends or climate–carbon feedbacks are overestimated resulting in too much land carbon loss or that carbon uptake due to CO2 and/or nitrogen fertilization is underestimated. Several one thousand year long, idealized, 2 × and 4 × CO2 experiments are used to quantify standard model characteristics, including transient and equilibrium climate sensitivities, and climate–carbon feedbacks. The values from EMICs generally fall within the range given by general circulation models. Seven additional historical simulations, each including a single specified forcing, are used to assess the contributions of different climate forcings to the overall climate and carbon cycle response. The response of surface air temperature is the linear sum of the individual forcings, while the carbon cycle response shows a non-linear interaction between land-use change and CO2 forcings for some models. Finally, the preindustrial portions of the last millennium simulations are used to assess historical model carbon-climate feedbacks. 
Given the specified forcing, there is a tendency for the EMICs to underestimate the drop in surface air temperature and CO2 between the Medieval Climate Anomaly and the Little Ice Age estimated from palaeoclimate reconstructions. This in turn could be a result of unforced variability within the climate system, uncertainty in the reconstructions of temperature and CO2, errors in the reconstructions of forcing used to drive the models, or the incomplete representation of certain processes within the models. Given the forcing datasets used in this study, the models calculate significant land-use emissions over the pre-industrial period. This implies that land-use emissions might need to be taken into account when making estimates of climate–carbon feedbacks from palaeoclimate reconstructions.

Relevance:

30.00%

Publisher:

Abstract:

We describe here a new reversed-phase high-performance liquid chromatography with mass spectrometry detection method for quantifying intact cytokinin nucleotides in human K-562 leukemia cells. Tandem mass spectrometry was used to identify the intracellular metabolites (cytokinin monophosphorylated, diphosphorylated, and triphosphorylated nucleotides) in riboside-treated cells. For protein precipitation and sample preparation, a trichloroacetic acid extraction method was used. Samples were then back-extracted with diethyl ether, lyophilized, reconstituted, and injected into the LC system. Analytes were quantified in negative selected ion monitoring mode using a single quadrupole mass spectrometer. The method was validated in terms of retention time stability, limits of detection, linearity, recovery, and analytical accuracy. The developed method was linear in the range of 1-1,000 pmol for all studied compounds. The limits of detection for the analytes vary from 0.2 to 0.6 pmol.

Relevance:

30.00%

Publisher:

Abstract:

For the development of meniscal substitutes and related finite element models it is necessary to know the mechanical properties of the meniscus and its attachments. Measurement errors can distort the determination of material properties. Therefore, the impact of metrological and geometrical measurement errors on the determination of the linear modulus of human meniscal attachments was investigated. After total differentiation the errors of the force (+0.10%), attachment deformation (−0.16%), and fibre length (+0.11%) measurements almost cancelled each other out. The error of the cross-sectional area determination ranged from 0.00%, gathered from histological slides, up to 14.22%, obtained from digital calliper measurements. Hence, total measurement error ranged from +0.05% to −14.17%, predominantly affected by the cross-sectional area determination error. Further investigations revealed that the entire cross-section was significantly larger compared to the load-carrying collagen fibre area. This overestimation of the cross-sectional area led to an underestimation of the linear modulus of up to −36.7%. Additionally, the cross-sections of the collagen-fibre area of the attachments significantly varied up to +90% along their longitudinal axis. The resultant ratio between the collagen fibre area and the histologically determined cross-sectional area ranged between 0.61 for the posterolateral and 0.69 for the posteromedial ligament. The linear modulus of human meniscal attachments can be significantly underestimated due to the use of different methods and locations of cross-sectional area determination. Hence, it is suggested to assess the load-carrying collagen fibre area histologically, or, alternatively, to use the correction factors proposed in this study.
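The effect described above follows directly from the definition of the linear modulus, E = stress/strain = F/(A·ε), which scales with 1/A. The numbers below are illustrative, not the study's measurements, but they show how the abstract's area ratio of 0.61 acts as a correction factor:

```python
# Linear modulus E = stress / strain = F / (A * strain), so it scales with 1/A:
# overestimating the cross-section underestimates the modulus proportionally.
F = 100.0         # tensile force in N (illustrative)
strain = 0.05     # dimensionless strain (illustrative)
A_fibre = 6.1e-6  # load-carrying collagen fibre area in m^2 (illustrative)
A_total = 1.0e-5  # entire histological cross-section in m^2 (illustrative)

E_true = F / (A_fibre * strain)   # modulus based on the fibre area
E_under = F / (A_total * strain)  # underestimated modulus from the full section
ratio = A_fibre / A_total         # 0.61, matching the posterolateral factor
E_corrected = E_under / ratio     # dividing by the ratio recovers E_true
```

Dividing the underestimated modulus by the area ratio recovers the fibre-based value, which is exactly how the proposed correction factors would be applied.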

Relevance:

30.00%

Publisher:

Abstract:

We report the identification of quantitative trait loci (QTL) affecting carcass composition, carcass length, fat deposition and lean meat content using a genome scan across 462 animals from a combined intercross and backcross between Hampshire and Landrace pigs. Data were analysed using multiple linear regression fitting additive and dominance effects. This model was compared with a model including a parent-of-origin effect to detect evidence of imprinting. Several precisely defined muscle phenotypes were measured in order to dissect body composition in more detail. Three significant QTL were detected in the study at the 1% genome-wide level, and twelve significant QTL were detected at the 5% genome-wide level. These QTL comprise loci affecting fat deposition and lean meat content on SSC1, 4, 9, 10, 13 and 16, a locus on SSC2 affecting the ratio between weight of meat and bone in back and weight of meat and bone in ham and two loci affecting carcass length on SSC12 and 17. The well-defined phenotypes in this study enabled us to detect QTL for sizes of individual muscles and to obtain information relevant to describing the complexity underlying other carcass traits.
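The additive-plus-dominance regression model can be sketched with the usual genotype coding (additive term = allele count − 1, dominance term = heterozygote indicator). The genotypes and effect sizes below are simulated for illustration, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(1)
geno = rng.integers(0, 3, size=500)  # genotypes coded as allele counts 0/1/2
a = geno - 1.0                       # additive coding: -1, 0, +1
d = (geno == 1).astype(float)        # dominance coding: heterozygote indicator
X = np.column_stack([np.ones_like(a), a, d])

# Simulate a phenotype with known additive and dominance effects (illustrative).
beta_true = np.array([10.0, 2.0, 0.5])  # intercept, additive, dominance
y = X @ beta_true + rng.normal(0.0, 0.1, size=500)

# Multiple linear regression recovers both effect estimates jointly.
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
```

A parent-of-origin (imprinting) term would add one more column distinguishing heterozygotes by the parental origin of the allele, which this simplified coding cannot separate.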

Relevance:

30.00%

Publisher:

Abstract:

The aim of this study is to develop a new simple method for analyzing one-dimensional transcranial magnetic stimulation (TMS) mapping studies in humans. Motor evoked potentials (MEP) were recorded from the abductor pollicis brevis (APB) muscle during stimulation at nine different positions on the scalp along a line passing through the APB hot spot and the vertex. Non-linear curve fitting according to the Levenberg-Marquardt algorithm was performed on the averaged amplitude values obtained at all points to find the best-fitting symmetrical and asymmetrical peak functions. Several peak functions could be fitted to the experimental data. Across all subjects, a symmetric, bell-shaped curve, the complementary error function (erfc), gave the best results. This function is characterized by three parameters giving its amplitude, position, and width. None of the mathematical functions tested with fewer or more than three parameters fitted better. The amplitude and position parameters of the erfc were highly correlated with the amplitude at the hot spot and with the location of the center of gravity of the TMS curve. In conclusion, non-linear curve fitting is an accurate method for the mathematical characterization of one-dimensional TMS curves. This is the first method that provides information on amplitude, position and width simultaneously.
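The abstract does not give the exact parametrization; one plausible symmetric, bell-shaped three-parameter form is amp·erfc(|x − pos|/width), fitted here with the Levenberg-Marquardt algorithm via SciPy on synthetic data (the functional form and values are assumptions of this sketch):

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.special import erfc

def tms_curve(x, amp, pos, width):
    # Symmetric bell shape: erfc of the scaled distance from the peak position.
    return amp * erfc(np.abs(x - pos) / width)

x = np.arange(9.0)               # nine stimulation positions along the scalp line
y = tms_curve(x, 2.0, 4.2, 1.5)  # synthetic, noise-free MEP amplitudes

# Levenberg-Marquardt fit recovers amplitude, position, and width together.
popt, _ = curve_fit(tms_curve, x, y, p0=(1.0, 4.0, 1.0), method="lm")
```

The fitted position parameter plays the role of the curve's center of gravity, and the amplitude parameter that of the hot-spot amplitude, mirroring the correlations reported above.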

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: The Anesthetic Conserving Device (AnaConDa) uncouples delivery of a volatile anesthetic (VA) from fresh gas flow (FGF) using a continuous infusion of liquid volatile into a modified heat-moisture exchanger capable of adsorbing VA during expiration and releasing adsorbed VA during inspiration. It combines the simplicity and responsiveness of high FGF with low agent expenditures. We performed in vitro characterization of the device before developing a population pharmacokinetic model for sevoflurane administration with the AnaConDa, and retrospectively testing its performance (internal validation). MATERIALS AND METHODS: Eighteen females and 20 males, aged 31-87, BMI 20-38, were included. The end-tidal concentrations were varied and recorded together with the VA infusion rates into the device, ventilation and demographic data. The concentration-time course of sevoflurane was described using linear differential equations, and the most suitable structural model and typical parameter values were identified. The individual pharmacokinetic parameters were obtained and tested for covariate relationships. Prediction errors were calculated. RESULTS: In vitro studies assessed the contribution of the device to the pharmacokinetic model. In vivo, the sevoflurane concentration-time courses on the patient side of the AnaConDa were adequately described with a two-compartment model. The population median absolute prediction error was 27% (interquartile range 13-45%). CONCLUSION: The predictive performance of the two-compartment model was similar to that of models accepted for TCI administration of intravenous anesthetics, supporting open-loop administration of sevoflurane with the AnaConDa. Further studies will focus on prospective testing and external validation of the model implemented in a target-controlled infusion device.
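A two-compartment model with first-order rate constants is a standard set of linear differential equations. The sketch below uses hypothetical rate constants (not the study's fitted population values) and a constant infusion into the central compartment:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Hypothetical rate constants (1/min) and central volume; not the fitted values.
k10, k12, k21, V1 = 0.1, 0.05, 0.03, 10.0
rate = 2.0  # constant infusion into the central compartment (amount/min)

def two_compartment(t, a):
    a1, a2 = a  # drug amounts in the central and peripheral compartments
    da1 = rate - (k10 + k12) * a1 + k21 * a2  # input, elimination, redistribution
    da2 = k12 * a1 - k21 * a2                 # peripheral uptake and return
    return [da1, da2]

sol = solve_ivp(two_compartment, (0.0, 600.0), [0.0, 0.0], rtol=1e-6, atol=1e-9)
c1_end = sol.y[0, -1] / V1  # approaches the steady state rate / (k10 * V1)
```

In a target-controlled infusion setting, such a model is run in reverse: the infusion rate is chosen so that the predicted central concentration tracks the target.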

Relevance:

30.00%

Publisher:

Abstract:

The problem of re-sampling spatially distributed data organized into regular or irregular grids to finer or coarser resolution is a common task in data processing. This procedure is known as 'gridding' or 're-binning'. Depending on the quantity the data represents, the gridding algorithm has to meet different requirements. For example, histogrammed physical quantities such as mass or energy have to be re-binned in order to conserve the overall integral. Moreover, if the quantity is positive definite, negative sampling values should be avoided. The gridding process requires a re-distribution of the original data set to a user-requested grid according to a distribution function. The distribution function can be determined on the basis of the given data by interpolation methods. In general, accurate interpolation with respect to multiple boundary conditions of heavily fluctuating data requires polynomial interpolation functions of second or even higher order. However, this may result in unrealistic deviations (overshoots or undershoots) of the interpolation function from the data. Accordingly, the re-sampled data may overestimate or underestimate the given data by a significant amount. The gridding algorithm presented in this work was developed in order to overcome these problems. Instead of a straightforward interpolation of the given data using high-order polynomials, a parametrized Hermitian interpolation curve was used to approximate the integrated data set. A single parameter is determined by which the user can control the behavior of the interpolation function, i.e. the amount of overshoot and undershoot. Furthermore, it is shown how the algorithm can be extended to multidimensional grids. The algorithm was compared to commonly used gridding algorithms using linear and cubic interpolation functions. It is shown that such interpolation functions may overestimate or underestimate the source data by about 10-20%, while the new algorithm can be tuned to significantly reduce these interpolation errors. The accuracy of the new algorithm was tested on a series of X-ray CT images (head and neck, lung, pelvis). The new algorithm significantly improves the accuracy of the sampled images in terms of the mean square error and a quality index introduced by Wang and Bovik (2002 IEEE Signal Process. Lett. 9 81-4).
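The integrate-interpolate-differentiate idea can be sketched with SciPy's monotone PCHIP interpolant standing in for the paper's parametrized Hermitian curve (PCHIP fixes the shape rather than exposing a tuning parameter, an assumption of this sketch):

```python
import numpy as np
from scipy.interpolate import PchipInterpolator

def rebin_conserving(edges, counts, new_edges):
    """Re-bin histogrammed data by interpolating its cumulative integral.

    PCHIP is monotone on monotone data, so for non-negative counts the
    interpolated cumulative never overshoots and the new bins stay >= 0.
    """
    cum = np.concatenate([[0.0], np.cumsum(counts)])  # integral at bin edges
    interp = PchipInterpolator(edges, cum)
    return np.diff(interp(new_edges))  # differences give the new bin contents
```

Because the cumulative values at the outer edges are matched exactly, the total integral is conserved regardless of the requested grid.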

Relevance:

30.00%

Publisher:

Abstract:

BEAMnrc, a code for simulating medical linear accelerators based on EGSnrc, has been benchmarked and used extensively in the scientific literature and is therefore often considered to be the gold standard for Monte Carlo simulations for radiotherapy applications. However, its long computation times make it too slow for the clinical routine and often even for research purposes without a large investment in computing resources. VMC++ is a much faster code thanks to the intensive use of variance reduction techniques and a much faster implementation of the condensed history technique for charged particle transport. A research version of this code is also capable of simulating the full head of linear accelerators operated in photon mode (excluding multileaf collimators, hard and dynamic wedges). In this work, a validation of the full head simulation at 6 and 18 MV is performed, simulating with VMC++ and BEAMnrc the addition of one head component at a time and comparing the resulting phase space files. For the comparison, photon and electron fluence, photon energy fluence, mean energy, and photon spectra are considered. The largest absolute differences are found in the energy fluences. For all the simulations of the different head components, a very good agreement (differences in energy fluences between VMC++ and BEAMnrc <1%) is obtained. Only a particular case at 6 MV shows a somewhat larger energy fluence difference of 1.4%. Dosimetrically, these phase space differences imply an agreement between both codes at the <1% level, making the VMC++ head module suitable for full head simulations with considerable gain in efficiency and without loss of accuracy.

Relevance:

30.00%

Publisher:

Abstract:

The neurocognitive processes underlying the formation and maintenance of paranormal beliefs are important for understanding schizotypal ideation. Behavioral studies indicated that both schizotypal and paranormal ideation are based on an overreliance on the right hemisphere, whose coarse rather than focused semantic processing may favor the emergence of 'loose' and 'uncommon' associations. To elucidate the electrophysiological basis of these behavioral observations, 35-channel resting EEG was recorded in pre-screened female strong believers and disbelievers during resting baseline. EEG data were subjected to FFT-Dipole-Approximation analysis, a reference-free frequency-domain dipole source modeling, and Regional (hemispheric) Omega Complexity analysis, a linear approach estimating the complexity of the trajectories of momentary EEG map series in state space. Compared to disbelievers, believers showed: more right-located sources of the beta2 band (18.5-21 Hz, excitatory activity); reduced interhemispheric differences in Omega complexity values; higher scores on the Magical Ideation scale; more general negative affect; and more hypnagogic-like reveries after a 4-min eyes-closed resting period. Thus, subjects differing in their declared paranormal belief displayed different active, cerebral neural populations during resting, task-free conditions. As hypothesized, believers showed relatively higher right hemispheric activation and reduced hemispheric asymmetry of functional complexity. These markers may constitute the neurophysiological basis for paranormal and schizotypal ideation.

Relevance:

30.00%

Publisher:

Abstract:

Global complexity of spontaneous brain electric activity was studied before and after chewing gum without flavor and with 2 different flavors. One-minute, 19-channel, eyes-closed electroencephalograms (EEG) were recorded from 20 healthy males before and after using 3 types of chewing gum: regular gum containing sugar and aromatic additives, gum containing 200 mg theanine (a constituent of Japanese green tea), and gum base (no sugar, no aromatic additives); each was chewed for 5 min in randomized sequence. Brain electric activity was assessed through Global Omega (Ω)-Complexity and Global Dimensional Complexity (GDC), quantitative measures of complexity of the trajectory of EEG map series in state space; their differences from pre-chewing data were compared across gum-chewing conditions. Friedman ANOVA (p < 0.043) showed that effects on Ω-Complexity differed significantly between conditions, and differences were maximal between gum base and theanine gum. No differences were found using GDC. Global Omega-Complexity appears to be a sensitive measure for subtle, central effects of chewing gum with and without flavor.
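Omega complexity is commonly computed from the eigenvalue spectrum of the spatial covariance matrix of the multichannel EEG: with the eigenvalues normalized to sum to one, Ω is the exponential of their Shannon entropy, ranging from 1 (a single global component) to the number of channels (spatially uncorrelated activity). A sketch under that standard definition, not the papers' exact pipeline:

```python
import numpy as np

def omega_complexity(eeg):
    # eeg: (n_samples, n_channels) array. Omega is the exponential of the
    # entropy of the normalized eigenvalue spectrum of the spatial covariance.
    centered = eeg - eeg.mean(axis=0)
    cov = centered.T @ centered / len(eeg)
    lam = np.clip(np.linalg.eigvalsh(cov), 0.0, None)  # guard tiny negatives
    p = lam / lam.sum()
    p = p[p > 0]
    return float(np.exp(-(p * np.log(p)).sum()))
```

A map series driven by one common source yields Ω near 1, whereas spatially independent channel noise pushes Ω toward the channel count, which is why the measure indexes the spatial complexity of the trajectory in state space.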

Relevance:

30.00%

Publisher:

Abstract:

This paper introduces and analyzes a stochastic search method for parameter estimation in linear regression models in the spirit of Beran and Millar [Ann. Statist. 15(3) (1987) 1131–1154]. The idea is to generate a random finite subset of a parameter space which will automatically contain points which are very close to an unknown true parameter. The motivation for this procedure comes from recent work of Dümbgen et al. [Ann. Statist. 39(2) (2011) 702–730] on regression models with log-concave error distributions.
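The core idea, drawing a finite random subset of the parameter space and keeping the candidate that best fits the data, can be sketched for least-squares linear regression; the search region, sample sizes, and true parameter below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical linear model y = X @ beta with a known true parameter.
beta_true = np.array([1.5, -2.0])
X = rng.normal(size=(200, 2))
y = X @ beta_true  # noise-free for clarity

# Stochastic search: a large enough random subset of the parameter space
# automatically contains points very close to the true parameter.
candidates = rng.uniform(-5.0, 5.0, size=(20_000, 2))
sse = ((y[:, None] - X @ candidates.T) ** 2).sum(axis=0)  # one SSE per candidate
beta_best = candidates[np.argmin(sse)]
```

The appeal of the approach is that the criterion being minimized need not be smooth or convex; here least squares is used only so the result can be checked against the known parameter.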

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVES We sought to analyze the time course of atrial fibrillation (AF) episodes before and after circular plus linear left atrial ablation and the percentage of patients with complete freedom from AF after ablation by using serial seven-day electrocardiograms (ECGs). BACKGROUND The curative treatment of AF targets the pathophysiological cornerstones of AF (i.e., the initiating triggers and/or the perpetuation of AF). The pathophysiological complexity of both may not result in an "all-or-nothing" response but may modify number and duration of AF episodes. METHODS In patients with highly symptomatic AF, circular plus linear ablation lesions were placed around the left and right pulmonary veins, between the two circles, and from the left circle to the mitral annulus using the electroanatomic mapping system. Repetitive continuous 7-day ECGs administered before and after catheter ablation were used for rhythm follow-up. RESULTS In 100 patients with paroxysmal (n = 80) and persistent (n = 20) AF, relative duration of time spent in AF significantly decreased over time (35 +/- 37% before ablation, 26 +/- 41% directly after ablation, and 10 +/- 22% after 12 months). Freedom from AF increased stepwise in patients with paroxysmal AF and after 12 months measured 88% or 74%, depending on whether 24-h ECG or 7-day ECG was used. Complete pulmonary vein isolation was demonstrated in <20% of the circular lesions. CONCLUSIONS The results obtained in patients with AF treated with circular plus linear left atrial lesions strongly indicate that substrate modification is the main underlying pathophysiologic mechanism and that it results in a delayed cure instead of an immediate cure.