25 results for Palaeomagnetism Applied to Geologic Processes
in BORIS: Bern Open Repository and Information System - Bern - Switzerland
Abstract:
We study the existence of random elements with partially specified distributions. The technique relies on the existence of a positive extension of a linear functional, accompanied by additional conditions that ensure the regularity of the extension needed to interpret it as a probability measure. It is shown in which cases the extension can be chosen to possess certain invariance properties. The results are applied to the existence of point processes with a given correlation measure and of random closed sets with a given two-point covering function or contact distribution function. It is shown that the regularity condition can be checked efficiently in many cases, ensuring that the obtained point processes are indeed locally finite and the random sets have closed realisations.
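A minimal sketch of the extension argument, stated in generic form; the exact spaces, invariance conditions, and regularity hypotheses of the paper are not reproduced here.

```latex
% Generic positive-extension scheme (illustrative only; not the paper's exact hypotheses).
\begin{aligned}
& H \subseteq C_b(X), \quad 1 \in H, \quad L : H \to \mathbb{R}, \\
& L(1) = 1, \qquad L(f) \ge 0 \ \text{ whenever } f \in H,\ f \ge 0, \\
& \exists\, \tilde L : C_b(X) \to \mathbb{R} \ \text{positive linear}, \quad \tilde L\big|_H = L, \\
& \tilde L(f) = \int_X f \,\mathrm{d}\mu \quad \text{for a probability measure } \mu \ \text{(under suitable regularity conditions)}.
\end{aligned}
```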
Abstract:
A global metabolic profiling methodology based on gas chromatography coupled to time-of-flight mass spectrometry (GC-TOFMS) for human plasma was applied to a human exercise study focused on the effects of beverages containing glucose, galactose, or fructose taken after exercise and throughout a recovery period of 6 h and 45 min. One group of 10 well trained male cyclists performed 3 experimental sessions on separate days (randomized, single center). After performing a standardized depletion protocol on a bicycle, subjects consumed one of three different beverages: maltodextrin (MD)+glucose (2:1 ratio), MD+galactose (2:1), or MD+fructose (2:1), ingested at an average of 1.25 g of carbohydrate (CHO) per minute. Blood was taken straight after exercise and every 45 min during the recovery phase. With the resulting blood plasma, insulin, free fatty acid (FFA) profile, glucose, and GC-TOFMS global metabolic profiling measurements were performed. The resulting profiling data matched the results obtained from the other clinical measurements, with the addition of being able to follow many different metabolites throughout the recovery period. The data quality was assessed, with all the labelled internal standards yielding values of <15% CV for all samples (n=335), apart from labelled sucrose, which gave a value of 15.19%. Differences between recovery treatments, including the appearance of galactonic acid from the galactose-based beverage, were also highlighted.
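As a rough illustration of the quality check described above, the percent coefficient of variation (CV) of each labelled internal standard can be computed across all samples; the column names and values below are hypothetical placeholders, not the study's data.

```python
import pandas as pd

def percent_cv(values: pd.Series) -> float:
    """Coefficient of variation in percent: 100 * sample std / mean."""
    return 100.0 * values.std(ddof=1) / values.mean()

# Hypothetical table: one row per plasma sample, one column per labelled internal standard.
df = pd.DataFrame({
    "13C-glucose": [1.02, 0.98, 1.05, 0.97],
    "13C-sucrose": [0.90, 1.20, 1.05, 0.85],
})

cv_per_standard = df.apply(percent_cv)
flagged = cv_per_standard[cv_per_standard >= 15.0]  # standards failing the <15% CV criterion
print(cv_per_standard.round(2))
print("Above threshold:", list(flagged.index))
```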
Abstract:
When estimating the effect of treatment on HIV using data from observational studies, standard methods may produce biased estimates due to the presence of time-dependent confounders. Such confounding can be present when a covariate, affected by past exposure, is a predictor of both the future exposure and the outcome. One example is the CD4 cell count, which is a marker of disease progression in HIV patients, but also a marker for treatment initiation and is itself influenced by treatment. Fitting a marginal structural model (MSM) using inverse probability weights is one way to provide appropriate adjustment for this type of confounding. In this paper we study a simple and intuitive approach to estimate similar treatment effects, using observational data to mimic several randomized controlled trials. Each 'trial' is constructed from individuals starting treatment in a certain time interval. An overall effect estimate for all such trials is found using composite likelihood inference. The method offers an alternative to the use of inverse probability of treatment weights, which is unstable in certain situations. The estimated parameter is not identical to that of an MSM; it is conditioned on covariate values at the start of each mimicked trial. This allows the study of questions that are not as easily addressed by fitting an MSM. The analysis can be performed as a stratified weighted Cox analysis on the joint data set of all the constructed trials, where each trial is one stratum. The model is applied to data from the Swiss HIV cohort study.
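A minimal sketch of the final estimation step, the stratified Cox fit over the stacked data set of mimicked trials. The file and column names (trial, followup, event, treated, baseline_cd4) are hypothetical, and the lifelines package is used here only for illustration; it is not necessarily the software of the original analysis.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical stacked data set: one row per subject per mimicked trial.
# Assumed columns: 'trial' (the mimicked trial, used as stratum), 'followup', 'event',
# 'treated' (exposure at trial start), 'baseline_cd4' (covariate at trial start).
stacked = pd.read_csv("mimicked_trials.csv")  # placeholder file name

cph = CoxPHFitter()
cph.fit(
    stacked[["followup", "event", "treated", "baseline_cd4", "trial"]],
    duration_col="followup",
    event_col="event",
    strata=["trial"],   # each mimicked trial is its own stratum
    robust=True,        # sandwich SEs, since subjects may appear in several trials
    # weights_col="w",  # analysis weights could be supplied here if used
)
cph.print_summary()
```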
Abstract:
Background: The World Health Organization estimates that in sub-Saharan Africa about 4 million HIV-infected patients had started antiretroviral therapy (ART) by the end of 2008. Loss of patients to follow-up and care is an important problem for treatment programmes in this region. As mortality is high in these patients compared to patients remaining in care, ART programmes with high rates of loss to follow-up may substantially underestimate mortality of all patients starting ART. Methods and Findings: We developed a nomogram to correct mortality estimates for loss to follow-up, based on the fact that mortality of all patients starting ART in a treatment programme is a weighted average of mortality among patients lost to follow-up and patients remaining in care. The nomogram gives a correction factor based on the percentage of patients lost to follow-up at a given point in time, and the estimated ratio of mortality between patients lost and not lost to follow-up. The mortality observed among patients retained in care is then multiplied by the correction factor to obtain an estimate of programme-level mortality that takes all deaths into account. A web calculator directly calculates the corrected, programme-level mortality with 95% confidence intervals (CIs). We applied the method to 11 ART programmes in sub-Saharan Africa. Patients retained in care had a mortality at 1 year of 1.4% to 12.0%; loss to follow-up ranged from 2.8% to 28.7%; and the correction factor from 1.2 to 8.0. The absolute difference between uncorrected and corrected mortality at 1 year ranged from 1.6% to 9.8%, and was above 5% in four programmes. The largest difference in mortality was in a programme with 28.7% of patients lost to follow-up at 1 year. Conclusions: The amount of bias in mortality estimates can be large in ART programmes with substantial loss to follow-up. Programmes should routinely report mortality among patients retained in care and the proportion of patients lost. A simple nomogram can then be used to estimate mortality among all patients who started ART, for a range of plausible mortality rates among patients lost to follow-up.
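The correction follows directly from the weighted-average relation stated above. The small sketch below implements that relation as a point estimate; the function name and interface are illustrative, and it does not reproduce the web calculator's 95% confidence intervals.

```python
def corrected_mortality(mortality_retained: float,
                        fraction_lost: float,
                        mortality_ratio_lost_vs_retained: float) -> float:
    """Programme-level mortality as a weighted average over retained and lost patients.

    mortality_retained: observed mortality among patients retained in care (e.g. 0.05 for 5%)
    fraction_lost: proportion of patients lost to follow-up at the same time point (e.g. 0.25)
    mortality_ratio_lost_vs_retained: assumed ratio of mortality in lost vs. retained patients
    """
    correction_factor = (1 - fraction_lost) + fraction_lost * mortality_ratio_lost_vs_retained
    return mortality_retained * correction_factor

# Example: 5% observed mortality, 25% lost to follow-up, lost patients assumed to have 4x mortality.
print(corrected_mortality(0.05, 0.25, 4.0))  # 0.0875, i.e. correction factor 1.75
```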
Abstract:
Many methodologies dealing with prediction or simulation of soft tissue deformations on medical image data require preprocessing of the data in order to produce a different shape representation that complies with standard methodologies, such as mass–spring networks or the finite element method (FEM). On the other hand, methodologies working directly in the image space normally do not take into account the mechanical behavior of tissues and tend to lack the physical foundations driving soft tissue deformations. This chapter presents a method to simulate soft tissue deformations based on coupled concepts from image analysis and mechanics theory. The proposed methodology is based on a robust stochastic approach that takes into account material properties retrieved directly from the image, together with concepts from continuum mechanics and FEM. The optimization framework is solved within a hierarchical Markov random field (HMRF), which is implemented on the graphics processing unit (GPU).
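For orientation only, a generic Markov random field energy of the kind such frameworks typically minimise is shown below; the concrete data and regularisation terms of the chapter's HMRF formulation are not reproduced here.

```latex
% Generic MRF energy (illustrative; not the chapter's exact formulation).
% x_i : deformation label at site i, y : image data, N : set of neighbouring site pairs.
E(\mathbf{x}) \;=\; \sum_{i} \underbrace{\psi_i(x_i \mid y)}_{\text{data / material term}}
\;+\; \lambda \sum_{(i,j) \in N} \underbrace{\psi_{ij}(x_i, x_j)}_{\text{smoothness / mechanical regularisation}}
```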
Abstract:
OBJECTIVE: To evaluate fixation properties of a new intervertebral anchored fusion device and compare these with ventral locking plate fixation. STUDY DESIGN: In vitro biomechanical evaluation. ANIMALS: Cadaveric canine C4-C7 cervical spines (n = 9). METHODS: Cervical spines were nondestructively loaded with pure moments in a nonconstraining testing apparatus to induce flexion/extension while angular motion was measured. Range of motion (ROM) and neutral zone (NZ) were calculated for (1) intact specimens, (2) specimens after discectomy and fixation with a purpose-built intervertebral fusion cage with integrated ventral fixation, and (3) after removal of the device and fixation with a ventral locking plate. RESULTS: Both fixation techniques resulted in a decrease in ROM and NZ (P < .001) compared with the intact segments. There were no significant differences between the anchored spacer and locking plate fixation. CONCLUSION: An anchored spacer appears to provide similar biomechanical stability to that of locking plate fixation.
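As a rough illustration of how the two outcome measures are conventionally derived from flexion/extension moment–angle data (the abstract does not give the exact protocol, so the definitions below are the commonly used ones, not necessarily those of this study):

```python
import numpy as np

def range_of_motion(angles_deg: np.ndarray) -> float:
    """ROM: total angular span reached under the applied pure moments."""
    return float(angles_deg.max() - angles_deg.min())

def neutral_zone(angle_at_zero_moment_from_flexion: float,
                 angle_at_zero_moment_from_extension: float) -> float:
    """NZ: angular span between the two zero-load positions of the hysteresis loop."""
    return abs(angle_at_zero_moment_from_flexion - angle_at_zero_moment_from_extension)

# Hypothetical example values (degrees) for one motion segment:
angles = np.array([-8.0, -3.0, 0.0, 4.0, 10.0])
print(range_of_motion(angles))   # 18.0
print(neutral_zone(1.5, -2.0))   # 3.5
```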
Abstract:
The comprehension of stories requires the reader to imagine the cognitive and affective states of the characters. The content of many stories is unpleasant, as they often deal with conflict, disturbance or crisis. Nevertheless, unpleasant stories can be liked and enjoyed. In this fMRI study, we used a parametric approach to examine (1) the capacity of increasingly negative valence of story content to activate the mentalizing network (cognitive and affective theory of mind, ToM), and (2) the neural substrate of liking negatively valenced narratives. A set of 80 short narratives was compiled, ranging from neutral to negative emotional valence. For each story, mean ratings of valence and liking were obtained from a group of 32 participants in a prestudy and later included as parametric regressors in the fMRI analysis. Another group of 24 participants passively read the narratives in a 3 Tesla MRI scanner. Results revealed a stronger engagement of affective ToM-related brain areas with increasingly negative story valence. Stories that were unpleasant but simultaneously liked engaged the medial prefrontal cortex (mPFC), which might reflect the moral exploration of the story content. Further analysis showed that the more the mPFC becomes engaged during the reading of negatively valenced stories, the more coactivation can be observed in other brain areas related to the neural processing of affective ToM and empathy.
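To make the parametric-regressor idea concrete, the sketch below builds one amplitude-modulated regressor from per-story ratings by placing rating-scaled onsets and convolving them with a double-gamma HRF; the onset times, ratings, acquisition parameters, and HRF parameterisation are illustrative, not taken from the study.

```python
import numpy as np
from scipy.stats import gamma

TR, n_scans = 2.0, 300                   # hypothetical acquisition parameters
onsets = np.array([10.0, 70.0, 130.0])   # story onsets in seconds (illustrative)
valence = np.array([-1.2, 0.3, -2.0])    # per-story mean ratings (illustrative)

def double_gamma_hrf(t):
    """Textbook-style double-gamma haemodynamic response function."""
    return gamma.pdf(t, 6) - 0.35 * gamma.pdf(t, 16)

# Stick function at story onsets, scaled by the mean-centred rating of each story.
dt = 0.1
t_hi = np.arange(0, n_scans * TR, dt)
sticks = np.zeros_like(t_hi)
sticks[np.searchsorted(t_hi, onsets)] = valence - valence.mean()

# Convolve with the HRF and resample to scan times -> one parametric regressor for the GLM.
hrf = double_gamma_hrf(np.arange(0, 32, dt))
step = int(round(TR / dt))               # high-resolution samples per TR
regressor = np.convolve(sticks, hrf)[: len(t_hi)][::step]
print(regressor.shape)                   # (300,)
```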
Abstract:
By means of fixed-links modeling, the present study identified different processes of visual short-term memory (VSTM) functioning and investigated how these processes are related to intelligence. We conducted an experiment in which participants were presented with a color change detection task. Task complexity was manipulated by varying the number of presented stimuli (set size). We collected hit rate and reaction time (RT) as indicators of the amount of information retained in VSTM and of the speed of VSTM scanning, respectively. Due to the impurity of these measures, however, the variability in hit rate and RT was assumed to consist not only of genuine variance due to individual differences in VSTM retention and VSTM scanning, but also of other, non-experimental portions of variance. Therefore, we identified two qualitatively different types of components for both hit rate and RT: (1) non-experimental components representing processes that remained constant irrespective of set size and (2) experimental components reflecting processes that increased as a function of set size. For RT, intelligence was negatively associated with the non-experimental components, but was unrelated to the experimental components assumed to represent variability in VSTM scanning speed. This finding indicates that individual differences in basic processing speed, rather than in speed of VSTM scanning, differentiate between individuals of high and low intelligence. For hit rate, the experimental component constituting individual differences in VSTM retention was positively related to intelligence. The non-experimental components of hit rate, representing variability in basal processes, however, were not associated with intelligence. By decomposing VSTM functioning into non-experimental and experimental components, significant associations with intelligence were revealed that otherwise might have been obscured.
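A schematic of the fixed-links decomposition described above, in generic notation: the loadings are fixed a priori rather than estimated, with the constant component loading equally on all set sizes and the experimental component loading increasingly with set size. The specific loading values used in the study are not reproduced here.

```latex
% Generic fixed-links decomposition (illustrative notation, not the study's exact model).
% RT_k : reaction time at set size k; loadings are fixed a priori, not estimated.
RT_k \;=\; \underbrace{1 \cdot \eta_{\text{const}}}_{\text{non-experimental: constant across set sizes}}
\;+\; \underbrace{\lambda_k \cdot \eta_{\text{exp}}}_{\text{experimental: } \lambda_k \text{ increasing with set size}}
\;+\; \varepsilon_k
```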