47 results for resistive switching
Abstract:
BACKGROUND: Mild perioperative hypothermia increases the risk of several severe complications. Perioperative patient warming to preserve normothermia has thus become routine, with forced-air warming being used most often. In previous studies, various resistive warming systems have shown mixed results in comparison with forced-air. Recently, a polymer-based resistive patient warming system has been developed. We compared the efficacy of a standard forced-air warming system with the resistive polymer system in volunteers. METHODS: Eight healthy volunteers participated, each on two separate study days. Unanesthetized volunteers were cooled to a core temperature (tympanic membrane) of 34 °C by application of forced-air at 10 °C and a circulating-water mattress at 4 °C. Meperidine and buspirone were administered to prevent shivering. In a randomly designated order, volunteers were then rewarmed (until their core temperatures reached 36 °C) with one of the following active warming systems: (1) forced-air warming (Bair Hugger warming cover #300, blower #750, Arizant, Eden Prairie, MN); or (2) polymer fiber resistive warming (HotDog whole body blanket, HotDog standard controller, Augustine Biomedical, Eden Prairie, MN). The alternate system was used on the second study day. Metabolic heat production, cutaneous heat loss, and core temperature were measured. RESULTS: Metabolic heat production and cutaneous heat loss were similar with each system. After a 30-min delay, core temperature increased nearly linearly by 0.98 (95% confidence interval 0.91-1.04) °C/h with forced-air and by 0.92 (0.85-1.00) °C/h with resistive heating (P = 0.4). CONCLUSIONS: Heating efficacy and core rewarming rates were similar with full-body forced-air and full-body resistive polymer heating in healthy volunteers.
Abstract:
BACKGROUND: In high-income countries, viral load is routinely measured to detect failure of antiretroviral therapy (ART) and guide switching to second-line ART. Viral load monitoring is not generally available in resource-limited settings. We examined switching from nonnucleoside reverse transcriptase inhibitor (NNRTI)-based first-line regimens to protease inhibitor-based regimens in Africa, South America and Asia. DESIGN AND METHODS: Multicohort study of 17 ART programmes. All sites monitored CD4 cell count and had access to second-line ART, and 10 sites monitored viral load. We compared times to switching, CD4 cell counts at switching and obtained adjusted hazard ratios for switching (aHRs) with 95% confidence intervals (CIs) from random-effects Weibull models. RESULTS: A total of 20 113 patients, including 6369 (31.7%) patients from 10 programmes with access to viral load monitoring, were analysed; 576 patients (2.9%) switched. Low CD4 cell counts at ART initiation were associated with switching in all programmes. Median time to switching was 16.3 months [interquartile range (IQR) 10.1-26.6] in programmes with viral load monitoring and 21.8 months (IQR 14.0-21.8) in programmes without viral load monitoring (P < 0.001). Median CD4 cell counts at switching were 161 cells/µl (IQR 77-265) in programmes with viral load monitoring and 102 cells/µl (44-181) in programmes without viral load monitoring (P < 0.001). Switching was more common in programmes with viral load monitoring during months 7-18 after starting ART (aHR 1.38; 95% CI 0.97-1.98), similar during months 19-30 (aHR 0.97; 95% CI 0.58-1.60) and less common during months 31-42 (aHR 0.29; 95% CI 0.11-0.79). CONCLUSION: In resource-limited settings, switching to second-line regimens tends to occur earlier and at higher CD4 cell counts in ART programmes with viral load monitoring compared with programmes without viral load monitoring.
Abstract:
Sustained growth of solid tumours can rely on both the formation of new and the co-option of existing blood vessels. Current models suggest that binding of angiopoietin-2 (Ang-2) to its endothelial Tie2 receptor prevents receptor phosphorylation, destabilizes blood vessels, and promotes vascular permeability. In contrast, binding of angiopoietin-1 (Ang-1) induces Tie2 receptor activation and supports the formation of mature blood vessels covered by pericytes. Despite the intense research to decipher the role of angiopoietins during physiological neovascularization and tumour angiogenesis, a mechanistic understanding of angiopoietin function on vascular integrity and remodelling is still incomplete. We therefore assessed the vascular morphology of two mouse mammary carcinoma xenotransplants (M6378 and M6363) which differ in their natural angiopoietin expression. M6378 displayed Ang-1 in tumour cells but no Ang-2 in tumour endothelial cells in vivo. In contrast, M6363 tumours expressed Ang-2 in the tumour vasculature, whereas no Ang-1 expression was present in tumour cells. We stably transfected M6378 mouse mammary carcinoma cells with human Ang-1 or Ang-2 and investigated the consequences on the host vasculature, including ultrastructural morphology. Interestingly, M6378/Ang-2 and M6363 tumours displayed a similar vascular morphology, with intratumoural haemorrhage and non-functional and abnormal blood vessels. Pericyte loss was prominent in these tumours and was accompanied by increased endothelial cell apoptosis. Thus, overexpression of Ang-2 converted the vascular phenotype of M6378 tumours into a phenotype similar to M6363 tumours. Our results support the hypothesis that Ang-1/Tie2 signalling is essential for vessel stabilization and endothelial cell/pericyte interaction, and suggest that Ang-2 is able to induce a switch of vascular phenotypes within tumours.
Abstract:
The concept of platform switching has been introduced to implant dentistry based on clinical observations of reduced peri-implant crestal bone loss. However, published data are controversial, and most studies are limited to 12 months. The aim of the present randomized clinical trial was to test the hypothesis that platform switching has a positive impact on crestal bone-level changes after 3 years. Two implants with a diameter of 4 mm were inserted crestally in the posterior mandible of 25 patients. The intraindividual allocation of platform switching (3.3-mm platform) and the standard implant (4-mm platform) was randomized. After 3 months of submerged healing, single-tooth crowns were cemented. Patients were followed up at short intervals for monitoring of healing and oral hygiene. Statistical analysis for the influence of time and platform type on bone levels employed the Brunner-Langer model. At 3 years, the mean radiographic peri-implant bone loss was 0.69 ± 0.43 mm (platform switching) and 0.74 ± 0.57 mm (standard platform). The mean intraindividual difference was 0.05 ± 0.58 mm (95% confidence interval: -0.19, 0.29). Crestal bone-level alteration depended on time (p < .001) but not on platform type (p = .363). The present randomized clinical trial could not confirm the hypothesis of a reduced peri-implant crestal bone loss, when implants had been restored according to the concept of platform switching.
Abstract:
Context: In virologically suppressed, antiretroviral-treated patients, the effect of switching to tenofovir (TDF) on bone biomarkers, compared to remaining on stable antiretroviral therapy, is unknown. Methods: We examined bone biomarkers (osteocalcin [OC], procollagen type 1 amino-terminal propeptide, and C-terminal cross-linking telopeptide of type 1 collagen) and bone mineral density (BMD) over 48 weeks in virologically suppressed patients (HIV RNA < 50 copies/ml) randomized to switch to TDF/emtricitabine (FTC) or remain on first-line zidovudine (AZT)/lamivudine (3TC). PTH was also measured. Between-group differences in bone biomarkers and associations between change in bone biomarkers and BMD measures were assessed by Student's t tests, Pearson correlation, and multivariable linear regression, respectively. All data are expressed as mean (SD), unless otherwise specified. Results: Of 53 subjects (aged 46.0 y; 84.9% male; 75.5% Caucasian), 29 switched to TDF/FTC. There were reductions in total hip and lumbar spine BMD in those switching to TDF/FTC (total hip, TDF/FTC, −1.73 (2.76)% vs AZT/3TC, −0.39 (2.41)%; between-group P = .07; lumbar spine, TDF/FTC, −1.50 (3.49)% vs AZT/3TC, +0.25 (2.82)%; between-group P = .06), but these did not reach statistical significance. Greater declines in lumbar spine BMD correlated with greater increases in OC (r = −0.28; P = .05). The effect of TDF/FTC on bone biomarkers remained significant when adjusted for baseline biomarker levels, gender, and ethnicity. There was no difference in change in PTH levels over 48 weeks between treatment groups (between-group P = .23). All biomarkers increased significantly from weeks 0 to 48 in the switch group, with no significant change in those remaining on AZT/3TC (between-group, all biomarkers, P < .0001).
Conclusion: A switch to TDF/FTC compared to remaining on a stable regimen is associated with increases in bone turnover that correlate with reductions in BMD, suggesting that TDF exposure directly affects bone metabolism in vivo.
Abstract:
When switching tasks, if stimuli are presented that contain features cueing two of the tasks in the set (i.e., bivalent stimuli), performance slowing is observed on all tasks. This generalized slowing extends to tasks in the set that have no features in common with the bivalent stimulus and is referred to as the bivalency effect. In previous work, the bivalency effect was invoked by presenting occasionally occurring bivalent stimuli; therefore, the possibility that the generalized slowing is simply due to surprise (as opposed to bivalency) had not yet been discounted. This question was addressed in two task switching experiments where the occasionally occurring stimuli were either bivalent (bivalent version) or merely surprising (surprising version). The results confirmed that the generalized slowing was much greater in the bivalent version of both experiments, demonstrating that the magnitude of this effect is greater than can be accounted for by simple surprise. This set of results confirms that slowing task execution when encountering bivalent stimuli may be fundamental for efficient task switching, as adaptive tuning of response style may serve to prepare the cognitive system for possible future high-conflict trials.
Abstract:
The purpose of the present study was to investigate whether amnesic patients show a bivalency effect. The bivalency effect refers to the performance slowing that occurs when switching tasks and bivalent stimuli appear occasionally among univalent stimuli. According to the episodic context binding account, bivalent stimuli create a conflict-loaded context that is re-activated on subsequent trials; the effect is thus assumed to depend on memory binding processes. Given the profound memory deficit in amnesia, we hypothesized that the bivalency effect would be largely reduced in amnesic patients. We tested sixteen severely amnesic patients and a control group with a paradigm requiring predictable alternations between three simple cognitive tasks, with bivalent stimuli occasionally occurring on one of these tasks. The results showed the typical bivalency effect for the control group, that is, a generalized slowing for each task. In contrast, for amnesic patients, only a short-lived slowing was present on the task that followed immediately after a bivalent stimulus, indicating that the binding between tasks and context was impaired in amnesic patients.