82 results for simultaneous monitoring of process mean and variance
Abstract:
Purpose To determine whether diffusion-weighted (DW) magnetic resonance (MR) imaging in living renal allograft donation allows monitoring of potential changes in the donor's remaining, nontransplanted kidney caused by unilateral nephrectomy, as well as changes in the transplanted kidney before and after transplantation (in donor and recipient, respectively), and whether DW MR parameters are correlated in the same kidney before and after transplantation. Materials and Methods The study protocol was approved by the local ethics committee; written informed consent was obtained. Thirteen healthy kidney donors and their corresponding recipients prospectively underwent DW MR imaging (multiple b values): donors before donation, and donors and recipients at day 8 and months 3 and 12 after donation. Total apparent diffusion coefficient (ADCT) values were determined; the contribution of microcirculation was quantified as the perfusion fraction (FP). Longitudinal changes of diffusion parameters were compared (repeated-measures one-way analysis of variance with post hoc pairwise comparisons), and correlations were tested (linear regression). Results ADCT values in the nontransplanted kidney of donors increased from a preexplantation value of (188 ± 9 [standard deviation]) to (202 ± 11) × 10⁻⁵ mm²/sec in the medulla and from (199 ± 11) to (210 ± 13) × 10⁻⁵ mm²/sec in the cortex 1 week after donation (P < .004). Medullary, but not cortical, ADCT values remained elevated for up to 1 year. ADCT values in allografts in recipients were stable. Compared with values obtained before transplantation in donors, the corticomedullary difference was reduced in allografts (P < .03). Cortical ADCT values correlated with estimated glomerular filtration rate in recipients (R = 0.56, P < .001) but not in donors. Cortical ADCT values in the same kidney before transplantation in donors correlated with those in recipients on day 8 after transplantation (R = 0.77, P = .006). FP did not show significant changes. Conclusion DW MR imaging depicts early adaptations in the remaining, nontransplanted kidney of donors after nephrectomy. All diffusion parameters remained constant in allograft recipients after transplantation. This method has potential monitoring utility, although assessment of its clinical relevance is needed. © RSNA, 2013. Online supplemental material is available for this article.
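A minimal sketch of how a total ADC (ADCT) and a perfusion fraction (FP) can be derived from multi-b-value DW MR signals with an IVIM-type model. The abstract does not specify the fitting procedure, so the model choice, b values, and signal values below are illustrative assumptions, not the study's method.

```python
import numpy as np
from scipy.optimize import curve_fit

# Assumed b values (s/mm^2) and normalized signals S/S0 for one renal ROI (synthetic data)
b = np.array([0, 10, 20, 50, 100, 200, 400, 600, 800], dtype=float)
signal = np.array([1.00, 0.96, 0.92, 0.83, 0.73, 0.60, 0.41, 0.29, 0.20])

# Total ADC (ADCT): mono-exponential fit over all b values, S/S0 = exp(-b * ADCT)
adct = -np.polyfit(b, np.log(signal), 1)[0]

# Perfusion fraction (FP): bi-exponential IVIM model
# S/S0 = FP * exp(-b * Dstar) + (1 - FP) * exp(-b * D)
def ivim(b, fp, dstar, d):
    return fp * np.exp(-b * dstar) + (1 - fp) * np.exp(-b * d)

(fp, dstar, d), _ = curve_fit(ivim, b, signal, p0=(0.1, 0.01, 0.001),
                              bounds=([0.0, 0.003, 0.0], [0.5, 0.1, 0.003]))

print(f"ADCT ~ {adct * 1e5:.0f} x10^-5 mm^2/sec, FP ~ {100 * fp:.1f} %")
```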
Abstract:
Epileptic seizures are associated with highly stereotyped patient behavior. Characteristic signal patterns can be found in the EEG of epilepsy patients during and between seizures. Here we use ordinal patterns to analyze EEGs of epilepsy patients and quantify the degree of signal determinism. Besides relative signal redundancy and the fraction of forbidden patterns, we introduce the fraction of under-represented patterns as a new measure. Using the logistic map, parameter scans are performed to explore the sensitivity of the measures to signal determinism. We then apply the measures to two types of EEG recorded in two epilepsy patients. Intracranial EEG shows pronounced determinism peaks during seizures. Finally, we demonstrate that ordinal patterns may be useful for improving the analysis of non-invasive simultaneous EEG-fMRI.
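A minimal sketch of the ordinal-pattern idea, using the logistic map as the deterministic test signal mentioned above. The pattern length, signal lengths, and the comparison against uniform noise are illustrative assumptions; the under-represented-pattern measure introduced in the paper is not reproduced here.

```python
import numpy as np
from itertools import permutations

def ordinal_pattern_counts(x, d=4):
    """Count occurrences of each length-d ordinal (rank) pattern in a 1-D signal."""
    counts = {p: 0 for p in permutations(range(d))}
    for i in range(len(x) - d + 1):
        counts[tuple(np.argsort(x[i:i + d]).tolist())] += 1
    return counts

def fraction_forbidden(counts):
    """Fraction of possible patterns that never occur ('forbidden' patterns)."""
    return sum(1 for c in counts.values() if c == 0) / len(counts)

# Logistic map in the fully chaotic regime (r = 4) as a deterministic test signal
x = np.empty(5000)
x[0] = 0.4
for i in range(1, len(x)):
    x[i] = 4.0 * x[i - 1] * (1.0 - x[i - 1])

rng = np.random.default_rng(0)
print(fraction_forbidden(ordinal_pattern_counts(x)))                 # deterministic signal: clearly > 0
print(fraction_forbidden(ordinal_pattern_counts(rng.random(5000))))  # white noise: close to 0
```

For a fully random signal of sufficient length all patterns eventually appear, so a nonzero fraction of forbidden patterns points to deterministic structure.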
Abstract:
PURPOSE Therapeutic drug monitoring of patients receiving once-daily aminoglycoside therapy can be performed using pharmacokinetic (PK) formulas or Bayesian calculations. While these methods produce comparable results, their performance has never been checked against full PK profiles. We performed a PK study in order to compare both methods and to determine the best time points for estimating AUC0-24 and peak concentration (Cmax). METHODS We obtained full PK profiles in 14 patients receiving once-daily aminoglycoside therapy. PK parameters were calculated with PKSolver using non-compartmental methods. The calculated PK parameters were then compared with parameters estimated using an algorithm based on two serum concentrations (two-point method) or the software TCIWorks (Bayesian method). RESULTS For tobramycin and gentamicin, AUC0-24 and Cmax could be reliably estimated using a first serum concentration obtained at 1 h and a second one between 8 and 10 h after the start of the infusion. The two-point and the Bayesian method produced similar results. For amikacin, AUC0-24 could be reliably estimated by both methods, whereas Cmax was underestimated by 10-20% with the two-point method and by up to 30%, with large variation, with the Bayesian method. CONCLUSIONS The ideal time points for therapeutic drug monitoring of once-daily administered aminoglycosides are 1 h after the start of a 30-min infusion for the first time point and 8-10 h after the start of the infusion for the second time point. The duration of the infusion and accurate registration of the blood-drawing times are essential for obtaining precise predictions.
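A minimal sketch of a two-point, log-linear estimate of Cmax and AUC0-24, assuming mono-exponential elimination and a 30-min infusion as described above. The example concentrations, the back-extrapolation to the end of the infusion, and the AUC decomposition are illustrative assumptions and not necessarily the exact algorithm evaluated in the study.

```python
import math

def two_point_pk(c1, t1, c2, t2, t_inf=0.5, tau=24.0):
    """Estimate Cmax and AUC0-24 from two post-infusion serum levels,
    assuming mono-exponential (first-order) elimination.

    c1, c2 : serum concentrations (mg/L) at times t1, t2 (h after start of infusion)
    t_inf  : infusion duration (h), here 0.5 h (30 min)
    tau    : dosing interval (h)
    """
    ke = math.log(c1 / c2) / (t2 - t1)          # elimination rate constant (1/h)
    t_half = math.log(2) / ke                   # elimination half-life (h)
    cmax = c1 * math.exp(ke * (t1 - t_inf))     # back-extrapolate to end of infusion
    # AUC0-24: triangular area over the infusion phase plus exponential decay afterwards
    auc = 0.5 * t_inf * cmax + (cmax / ke) * (1.0 - math.exp(-ke * (tau - t_inf)))
    return {"ke": ke, "t_half": t_half, "cmax": cmax, "auc0_24": auc}

# Example: levels drawn 1 h and 8 h after the start of a 30-min infusion (invented values)
print(two_point_pk(c1=18.0, t1=1.0, c2=2.5, t2=8.0))
```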
Abstract:
The aim of this study was to test the effect of cardiac output (CO) and pulmonary artery hypertension (PHT) on volumetric capnography (VCap)-derived variables. Nine pigs were mechanically ventilated using fixed ventilatory settings. Two levels of PHT were induced by IV infusion of a thromboxane analogue: PHT25 [mean pulmonary arterial pressure (MPAP) of 25 mmHg] and PHT40 (MPAP of 40 mmHg). CO was increased by 50% from baseline (COup) with an infusion of dobutamine ≥5 μg kg⁻¹ min⁻¹ and decreased by 40% from baseline (COdown) by infusing sodium nitroglycerine ≥30 μg kg⁻¹ min⁻¹ plus esmolol 500 μg kg⁻¹ min⁻¹. A further state of PHT and COdown was induced by severe hypoxemia (FiO2 0.07). Invasive hemodynamic data and VCap were recorded and compared before and after each step using a mixed random-effects model. Compared to baseline, the normalized slope of phase III (SnIII) increased by 32% in PHT25 and by 22% in PHT40. SnIII decreased non-significantly by 4% with COdown. A combination of PHT and COdown associated with severe hypoxemia increased SnIII by 28% compared to baseline. The elimination of CO2 per breath decreased by 7% in PHT40 and by 12% in COdown but increased only slightly with COup. Dead space variables did not change significantly over the course of the protocol. At constant ventilation and body metabolism, pulmonary artery hypertension and decreases in CO had the largest effects on the SnIII of the volumetric capnogram and on the elimination of CO2.
Abstract:
This article describes, in short sections, the use and interpretation of indirect blood pressure measurements, central venous pressure, body temperature, pulse oximetry, end-tidal CO2 measurements, pulse and heart rate, urine production, and emergency laboratory values. Most of these parameters are directly or indirectly linked to the perfusion of the patient. Optimizing these values is one of the most important goals in emergency and critical care medicine.
Abstract:
An HPLC-DAD method for the quantitative analysis of Δ⁹-tetrahydrocannabinol (THC), Δ⁹-tetrahydrocannabinolic acid A (THCA-A), cannabidiol (CBD), and cannabinol (CBN) in confiscated cannabis products has been developed, fully validated, and applied to the analysis of seized cannabis products. For determination of the THC content of plant material, the method combines quantitation of THCA-A, the inactive precursor of THC, and of free THC. Plant material was dried, homogenized, and extracted with methanol by ultrasonication. Chromatographic separation was achieved with a Waters Alliance 2695 HPLC equipped with a Merck LiChrospher 60 RP-Select B (5 μm) precolumn and a Merck LiChroCart 125-4 LiChrospher 60 RP-Select B (5 μm) analytical column. Analytes were detected and quantified using a Waters 2996 photodiode array detector. The method has been accepted by the public authorities of Switzerland (Bundesamt für Gesundheit, Federal Office of Public Health) and has been used to analyse 9092 samples since 2000. Since no thermal decarboxylation of THCA-A occurs, the method is highly reproducible for different cannabis materials. Two calibration ranges are used: a lower one for THC, CBN, and CBD, and a higher one for THCA-A, owing to its dominant presence in fresh plant material. As our laboratory provides the Swiss proficiency test, the robustness of the method has been tested over several years, and homogeneity tests even in the low calibration range (1%) show high precision (RSD ≤ 4.3%, except CBD) and accuracy (bias ≤ 4.1%, except CBN).
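As a worked illustration of combining free THC and its acid precursor into a single total-THC figure, a short sketch follows. The molar-mass correction factor (≈0.877) and the example percentages are assumptions made here for illustration; the abstract does not state which conversion the Swiss method applies.

```python
# Hypothetical illustration of combining free THC and THCA-A into a total THC content.
# The factor MW(THC)/MW(THCA-A) = 314.5/358.5 (~0.877) accounts for the CO2 lost on
# decarboxylation; its use here is an assumption, not taken from the abstract.
MW_THC, MW_THCA_A = 314.5, 358.5

def total_thc_percent(free_thc_pct: float, thca_a_pct: float) -> float:
    """Total THC (% w/w) if all THCA-A were decarboxylated to THC."""
    return free_thc_pct + (MW_THC / MW_THCA_A) * thca_a_pct

# Example: herbal cannabis with 1.2 % free THC and 14.0 % THCA-A (invented values)
print(round(total_thc_percent(1.2, 14.0), 2))  # ~13.5 % total THC
```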
Abstract:
Introduction. Erroneous answers in studies on the misinformation effect (ME) can be reduced in different ways. In some studies, the ME was reduced by source-monitoring (SM) questions, warnings, or a low credibility of the source of the post-event information (PEI). Results are inconsistent, however. Of course, a participant can deliberately decide to refrain from reporting a critical item only when the difference between the original event and the PEI is distinguishable in principle. We were interested in the question of to what extent the influence of erroneous information on a central aspect of the original event can be reduced by different means, applied singly or in combination. Method. With a 2 (credibility: high vs. low) × 2 (warning: present vs. absent) between-subjects design and an additional control group that received neither misinformation nor a warning (N = 116), we examined the above-mentioned factors' influence on the ME. Participants viewed a short video of a robbery. The critical item suggested in the PEI was that the victim was given a kick by the perpetrator (which he actually was not). The memory test consisted of a two-alternative forced-choice recognition test followed by an SM test. Results. To our surprise, neither a main effect of erroneous PEI nor a main effect of credibility was found. The error rates for the critical item in the control group (50%) and in the high-credibility (65%) and low-credibility (52%) conditions without warning did not differ significantly. A warning about possible misleading information in the PEI significantly reduced the influence of misinformation in both credibility conditions, by 32-37%. Using an SM question also significantly reduced the error rate, but only in the high-credibility, no-warning condition. Conclusion and Future Research. Our results show that, in contrast to a warning or the use of an SM question, low source credibility did not reduce the ME. The most striking finding, however, was the absence of a main effect of erroneous PEI. Given the high error rate in the control group, we suspect that the wrong answers may have been caused either by the response format (recognition test) or by autosuggestion, possibly promoted by the high schema-consistency of the critical item. First results of a post-study in which we used open-ended questions before the recognition test support the former assumption. Results of a replication of this study using open-ended questions prior to the recognition test will be available by June.
Abstract:
OBJECTIVE In contrast to conventional breast imaging techniques, one major diagnostic benefit of breast magnetic resonance imaging (MRI) is the simultaneous acquisition of morphologic and dynamic enhancement characteristics, which are based on angiogenesis and therefore provide insights into tumor pathophysiology. The aim of this investigation was to intraindividually compare 2 macrocyclic MRI contrast agents with low risk for nephrogenic systemic fibrosis in the morphologic and dynamic characterization of histologically verified mass breast lesions, analyzed by blinded human evaluation and by a fully automatic computer-assisted diagnosis (CAD) technique. MATERIALS AND METHODS Institutional review board approval and patient informed consent were obtained. In this prospective, single-center study, 45 women with 51 histopathologically verified (41 malignant, 10 benign) mass lesions underwent 2 identical examinations at 1.5 T (mean time interval, 2.1 days) with 0.1 mmol/kg doses of gadoteric acid and gadobutrol. All magnetic resonance images were visually evaluated by 2 experienced, blinded breast radiologists in consensus and by an automatic CAD system; morphologic and dynamic characterization as well as the final human classification of lesions were performed according to the categories of the Breast Imaging Reporting and Data System (BI-RADS) MRI atlas. Lesions were also classified by the CAD system, which assigned a probability of malignancy (morpho-dynamic index; 0%-100%). Imaging results were correlated with histopathology as the gold standard. RESULTS The CAD system coded 49 of 51 lesions with both gadoteric acid and gadobutrol (detection rate, 96.1%); the initial signal increase was significantly higher for gadobutrol than for gadoteric acid for all lesions and for the malignant coded lesions (P < 0.05). Gadoteric acid resulted in more postinitial washout curves and fewer continuous increases for all lesions and for the malignant lesions compared with gadobutrol (CAD hot-spot regions, P < 0.05). Morphologically, the margins of the malignancies differed between the 2 agents, with gadobutrol demonstrating more spiculated and fewer smooth margins (P < 0.05). Lesion classifications by the human observers and by the morpho-dynamic index, compared with the histopathologic results, did not differ significantly between gadoteric acid and gadobutrol. CONCLUSIONS Macrocyclic contrast media can be reliably used for dynamic contrast-enhanced breast MRI. However, gadoteric acid and gadobutrol differed in some aspects of the dynamic and morphologic characterization of histologically verified breast lesions in this intraindividual comparison. Besides the standardization of technical parameters and imaging evaluation in breast MRI, standardization of the applied contrast medium appears important for obtaining optimally comparable MRI interpretations.
Abstract:
PURPOSE To determine the variability of apparent diffusion coefficient (ADC) values in various anatomic regions in the upper abdomen measured with magnetic resonance (MR) systems from different vendors and with different field strengths. MATERIALS AND METHODS Ten healthy men (mean age, 36.6 years ± 7.7 [standard deviation]) gave written informed consent to participate in this prospective, ethics committee-approved study. Diffusion-weighted (DW) MR imaging was performed in each subject with 1.5- and 3.0-T MR systems from each of three vendors at two institutions. Two readers independently measured ADC values in seven upper abdominal regions (left and right liver lobe, gallbladder, pancreas, spleen, and renal cortex and medulla). ADC values were tested for interobserver differences, as well as for differences related to field strength and vendor, with repeated-measures analysis of variance; coefficients of variation (CVs) and variance components were calculated. RESULTS Interreader agreement was excellent (intraclass correlation coefficient, 0.876). ADC values were (77.5-88.8) × 10⁻⁵ mm²/sec in the spleen and (250.6-278.5) × 10⁻⁵ mm²/sec in the gallbladder. There were no significant differences between ADC values measured at 1.5 T and those measured at 3.0 T in any anatomic region (P > .10 for all). In two of seven regions at 1.5 T (left and right liver lobes, P < .023) and in four of seven regions at 3.0 T (left liver lobe, pancreas, and renal cortex and medulla, P < .008), intervendor differences were significant. CVs ranged from 7.0% to 27.1% depending on the anatomic location. CONCLUSION Despite significant intervendor differences in ADC values of various anatomic regions of the upper abdomen, ADC values of the gallbladder, pancreas, spleen, and kidney may be comparable between MR systems from different vendors and between different field strengths.
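A brief, hypothetical sketch of the kind of between-system variability summary reported above (coefficient of variation across vendor and field-strength combinations); the ADC values in the array are invented for illustration and do not come from the study.

```python
import numpy as np

# Hypothetical ADC measurements (x10^-5 mm^2/sec) for one anatomic region:
# one row per subject, one column per MR system (3 vendors x 2 field strengths).
adc = np.array([
    [78.0, 81.0, 85.0, 80.0, 83.0, 88.0],
    [75.0, 79.0, 84.0, 77.0, 82.0, 86.0],
    [80.0, 83.0, 87.0, 82.0, 85.0, 90.0],
])

# Coefficient of variation across MR systems for each subject, then averaged
cv_per_subject = adc.std(axis=1, ddof=1) / adc.mean(axis=1)
print(f"mean CV across MR systems: {100 * cv_per_subject.mean():.1f}%")
```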
Abstract:
A technological development is described through which the stable carbon-, oxygen-, and nonexchangeable hydrogen-isotopic ratios (δ¹³C, δ¹⁸O, δ²H) are determined on a single carbohydrate (cellulose) sample with precision equivalent to conventional techniques (δ¹³C 0.15‰, δ¹⁸O 0.30‰, δ²H 3.0‰). This triple-isotope approach offers significant new research opportunities, most notably in physiology and medicine, isotope biogeochemistry, forensic science, and palaeoclimatology, when isotopic analysis of a common sample is desirable or when sample material is limited.
Abstract:
Ischemia/reperfusion injury (IRI) may result from ischemia caused by thrombotic occlusion, trauma, or surgical interventions, including transplantation, with subsequent reestablishment of circulation. Time-dependent molecular and structural changes result from the deprivation of blood and oxygen in the affected tissue during ischemia. Upon restoration of blood flow, a multifaceted network of plasma cascades is activated, including the complement, coagulation, kinin, and fibrinolytic systems, which plays a major role in the reperfusion-triggered inflammatory process. The plasma cascade systems are therefore promising therapeutic targets for attenuation of IRI. Earlier studies showed beneficial effects of inhibiting the complement system using specific complement inhibitors. However, pivotal roles in IRI are also attributed to the other cascades. This raises the question of whether drugs, such as C1 esterase inhibitor, which regulate more than one cascade at a time, have a higher therapeutic potential. The present review discusses different therapeutic approaches, ranging from specific complement inhibition to simultaneous inhibition of several plasma cascade systems, for the reduction of IRI, gives an overview of the plasma cascade systems involved in IRI, and highlights recent findings in this field.
Abstract:
BACKGROUND HIV-1 viral load (VL) testing is recommended to monitor antiretroviral therapy (ART) but is not universally available. We examined monitoring of first-line ART and switching to second-line ART in sub-Saharan Africa, 2004-2013. METHODS Adult HIV-1 infected patients starting combination ART in 16 countries were included. Switching was defined as a change from a non-nucleoside reverse-transcriptase inhibitor (NNRTI)-based regimen to a protease inhibitor (PI)-based regimen, with a change of ≥1 NRTI. Virological and immunological failure were defined per World Health Organization criteria. We calculated cumulative probabilities of switching and hazard ratios with 95% confidence intervals (CI) comparing routine VL monitoring, targeted VL monitoring, CD4 cell monitoring, and clinical monitoring, adjusted for programme and individual characteristics. FINDINGS Of 297,825 eligible patients, 10,352 (3.5%) switched during 782,412 person-years of follow-up. Compared to CD4 monitoring, hazard ratios for switching were 3.15 (95% CI 2.92-3.40) for routine VL monitoring, 1.21 (1.13-1.30) for targeted VL monitoring, and 0.49 (0.43-0.56) for clinical monitoring. Overall, 58.0% of patients with confirmed virological failure and 19.3% of patients with confirmed immunological failure switched within 2 years. Among patients who switched, the percentage with evidence of treatment failure based on a single CD4 or VL measurement ranged from 32.1% with clinical monitoring to 84.3% with targeted VL monitoring. Median CD4 counts at switching were 215 cells/µl under routine VL monitoring but lower with the other monitoring strategies (114-133 cells/µl). INTERPRETATION Overall, few patients switched to second-line ART, and switching occurred late in the absence of routine viral load monitoring. Switching was more common and occurred earlier with targeted or routine viral load testing.
Abstract:
Sound knowledge of the spatial and temporal patterns of rockfall is fundamental for the management of this very common hazard in mountain environments. Process-based, three-dimensional simulation models are nowadays capable of reproducing the spatial distribution of rockfall occurrences with reasonable accuracy through the simulation of numerous individual trajectories on highly resolved digital terrain models. At the same time, however, simulation models typically fail to quantify the ‘real’ frequency of rockfalls (in terms of return intervals). The analysis of impact scars on trees, in contrast, yields real rockfall frequencies, but trees may not be present at the location of interest, and rare trajectories may not be captured because of the limited age of forest stands. In this article, we demonstrate that coupling modeling with tree-ring techniques may overcome the limitations inherent in both approaches. Based on the analysis of 64 cells (40 m × 40 m) of a rockfall slope located above a 1631-m-long road section in the Swiss Alps, we present results from 488 rockfalls detected in 1260 trees. We show that tree impact data can be used not only (i) to reconstruct the real frequency of rockfalls for individual cells, but also (ii) to calibrate the rockfall model Rockyfor3D and (iii) to transform simulated trajectories into real frequencies (see the sketch below). Calibrated simulation results are in good agreement with real rockfall frequencies and reveal significant differences in rockfall activity between the cells (zones) along the road section. Real frequencies, expressed as rock passages per meter of road section, also enable quantification and direct comparison of the hazard potential between zones. The contribution provides an approach for hazard zoning that complements traditional methods with a quantification of rockfall frequencies in terms of return intervals through the systematic inclusion of impact records in trees.
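A conceptual, hypothetical sketch of step (iii), converting simulated passage counts into rock passages per meter of road per year by scaling with a tree-ring-derived frequency. The actual calibration used with Rockyfor3D in the study may differ; the function, cell counts, and years of record below are illustrative assumptions.

```python
# Hypothetical conversion of simulated rockfall trajectory counts into "real"
# frequencies (rock passages per meter of road per year), scaled by an observed
# tree-ring-derived frequency in a calibration cell. All numbers are invented.

def passage_frequency(sim_passages, sim_passages_calib, obs_events_calib,
                      record_years, cell_width_m=40.0):
    """Scale simulated passage counts with an observed (tree-ring) frequency.

    sim_passages       : simulated rock passages through the cell of interest
    sim_passages_calib : simulated passages through the calibration cell
    obs_events_calib   : rockfall impacts reconstructed from trees in that cell
    record_years       : length of the tree-ring record (years)
    cell_width_m       : road length covered by one cell (40 m grid)
    """
    events_per_year_calib = obs_events_calib / record_years
    scale = events_per_year_calib / sim_passages_calib  # events/year per simulated passage
    return sim_passages * scale / cell_width_m          # passages per meter of road per year

# Example with invented numbers: 30 reconstructed impacts over 120 years in the calibration cell
print(passage_frequency(sim_passages=850, sim_passages_calib=1200,
                        obs_events_calib=30, record_years=120))
```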