Abstract:
INTRODUCTION: Continuous EEG (cEEG) is increasingly used to monitor brain function in neuro-ICU patients. However, its value in patients with coma after cardiac arrest (CA), particularly in the setting of therapeutic hypothermia (TH), is only beginning to be elucidated. The aim of this study was to examine whether cEEG performed during TH may predict outcome. METHODS: From April 2009 to April 2010, we prospectively studied 34 consecutive comatose patients treated with TH after CA who were monitored with cEEG, initiated during hypothermia and maintained after rewarming. EEG background reactivity to painful stimulation was tested. We analyzed the association between cEEG findings and neurologic outcome, assessed at 2 months with the Glasgow-Pittsburgh Cerebral Performance Categories (CPC). RESULTS: Continuous EEG recording was started 12 ± 6 hours after CA and lasted 30 ± 11 hours. Nonreactive cEEG background (12 of 15 (80%) among nonsurvivors versus none of 19 (0%) survivors; P < 0.001) and prolonged discontinuous "burst-suppression" activity (11 of 15 (73%) versus none of 19; P < 0.001) were significantly associated with mortality. EEG seizures with absent background reactivity also differed significantly (seven of 15 (47%) versus none of 12 (0%); P = 0.001). In patients with a nonreactive background or seizures/epileptiform discharges on cEEG, no improvement was seen after TH. A nonreactive cEEG background during TH had a positive predictive value of 100% (95% confidence interval (CI), 74 to 100%) and a false-positive rate of 0% (95% CI, 0 to 18%) for mortality. All survivors had cEEG background reactivity, and the majority of them (14 (74%) of 19) had a favorable outcome (CPC 1 or 2). CONCLUSIONS: Continuous EEG monitoring showing a nonreactive or discontinuous background during TH is strongly associated with unfavorable outcome in patients with coma after CA. These data warrant larger studies to confirm the value of continuous EEG monitoring in predicting prognosis after CA and TH.
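The quoted confidence bounds are consistent with an exact binomial (Clopper-Pearson) interval, although the abstract does not name the method used; a minimal sketch in Python, assuming SciPy is available:

```python
from scipy.stats import beta

def clopper_pearson(k, n, alpha=0.05):
    """Exact (Clopper-Pearson) binomial confidence interval for k successes in n trials."""
    lo = beta.ppf(alpha / 2, k, n - k + 1) if k > 0 else 0.0
    hi = beta.ppf(1 - alpha / 2, k + 1, n - k) if k < n else 1.0
    return lo, hi

# PPV for mortality: all 12 patients with a nonreactive background died (12/12)
print(clopper_pearson(12, 12))  # ~ (0.74, 1.00), matching the reported 74 to 100%
# False-positive rate: 0 of 19 survivors had a nonreactive background
print(clopper_pearson(0, 19))   # ~ (0.00, 0.18), matching the reported 0 to 18%
```

Both computed intervals reproduce the figures reported in the abstract.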
Abstract:
Both N excess and deficiency may affect cotton yield and quality. It would therefore be useful to base N fertilization management on monitoring of the plant's nutritional status. This study investigated the correlations among the following methods of determining the N nutritional status of cotton (Gossypium hirsutum L., var. Latifolia): chlorophyll readings (SPAD-502®, Minolta), a specific-ion nitrate meter (Nitrate Meter C-141, Horiba-Cardy®), and laboratory analysis (conventional foliar diagnosis). Samples were taken weekly from two weeks before flowering to the fifth week after the first flower. The experiment was conducted on the Fazenda Santa Tereza, Itapeva, State of São Paulo, Brazil. The crop was fertilized with 40 kg ha-1 N at planting and 0, 30, 60, 90, and 120 kg ha-1 of side-dressed N. The range of leaf N contents reported as adequate for samples taken 80-90 days after plant emergence (traditional foliar diagnosis) may be used as a reference from the beginning of flowering when the plant is not stressed. Specific-ion nitrate meter readings can be used as an indicator of cotton N status from one week after the pinhead square stage until the third week of flowering; in this case, plants are well nourished when readings exceed 8,000 mg L-1 NO3-. The chlorophyll meter can also be used to estimate the nutritional status of cotton from the third week of flowering; in this case, readings should be above 48 in well-nourished plants.
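Read as a decision rule, the two instrument thresholds can be sketched as follows; a minimal illustration in Python, where the helper name and the simplified stage handling are hypothetical, while the thresholds (8,000 mg L-1 NO3- and SPAD 48) are the ones reported above:

```python
def cotton_n_status(weeks_after_first_flower, nitrate_mg_per_L=None, spad=None):
    """Classify cotton N status from whichever instrument is valid at the stage.

    Simplification: the nitrate meter is treated as valid up to the third week
    of flowering and the SPAD meter from the third week onward, per the abstract.
    """
    if weeks_after_first_flower <= 3 and nitrate_mg_per_L is not None:
        return "well-nourished" if nitrate_mg_per_L > 8000 else "check N supply"
    if weeks_after_first_flower >= 3 and spad is not None:
        return "well-nourished" if spad > 48 else "check N supply"
    return "no valid reading for this stage"

print(cotton_n_status(1, nitrate_mg_per_L=9200))  # well-nourished
print(cotton_n_status(4, spad=45))                # check N supply
```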
Abstract:
Drug development has improved over recent decades, with refinements in analytical techniques, population pharmacokinetic-pharmacodynamic (PK-PD) modelling and simulation, and new biomarkers of efficacy and tolerability. Yet this progress has not yielded improvements in individualization of treatment and monitoring, owing to various obstacles: monitoring is complex and demanding, many monitoring procedures have been instituted without critical assessment of the underlying evidence and rationale, controlled clinical trials are sparse, monitoring procedures are poorly validated and both drug manufacturers and regulatory authorities take insufficient account of the importance of monitoring. Drug concentration and effect data should be increasingly collected, analyzed, aggregated and disseminated in forms suitable for prescribers, along with efficient monitoring tools and evidence-based recommendations regarding their best use. PK-PD observations should be collected for both novel and established critical drugs and applied to observational data, in order to establish whether monitoring would be suitable. Methods for aggregating PK-PD data in systematic reviews should be devised. Observational and intervention studies to evaluate monitoring procedures are needed. Miniaturized monitoring tests for delivery at the point of care should be developed and harnessed to closed-loop regulated drug delivery systems. Intelligent devices would enable unprecedented precision in the application of critical treatments, i.e. those with life-saving efficacy, narrow therapeutic margins and high interpatient variability. Pharmaceutical companies, regulatory agencies and academic clinical pharmacologists share the responsibility of leading such developments, in order to ensure that patients obtain the greatest benefit and suffer the least harm from their medicines.
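As an illustration of the closed-loop idea mentioned above, the sketch below simulates a one-compartment pharmacokinetic model and adjusts each dose in proportion to the gap between the measured trough and a target; every parameter value, the controller gain, and the scenario are hypothetical assumptions for illustration only, not a dosing method from the source:

```python
import math

# Toy closed-loop dosing: one-compartment PK with proportional feedback.
V = 50.0        # volume of distribution (L) -- assumed
k = 0.1         # first-order elimination rate constant (1/h) -- assumed
TARGET = 10.0   # desired trough concentration (mg/L) -- assumed
TAU = 12.0      # dosing interval (h) -- assumed

conc, dose = 0.0, 400.0  # starting trough (mg/L) and first dose (mg)
for cycle in range(8):
    conc += dose / V              # bolus dose raises concentration by D/V
    conc *= math.exp(-k * TAU)    # first-order decay until the next trough
    # proportional feedback: scale the next dose by the relative trough error
    dose *= 1 + 0.5 * (TARGET - conc) / TARGET
    print(f"cycle {cycle}: trough {conc:.2f} mg/L, next dose {dose:.0f} mg")
```

In this toy run the trough climbs toward the 10 mg/L target over successive cycles, which is the behavior a closed-loop regulated delivery system would automate.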
Abstract:
The Iowa EHDI High-Risk Monitoring Protocol is based on the Joint Committee on Infant Hearing 2007 position statement. Emphasis is placed on follow-up as deemed appropriate by the primary health care provider and audiologist. The Iowa protocol describes the follow-up process for children with risk factors.
Abstract:
Background: Recent data have suggested that a population of CD4+ CD25high T cells, phenotypically characterized by the expression of CD45RO and CD127, is significantly expanded in stable liver and kidney transplant recipients and represents alloreactive T cells. Induction therapies may have an impact on this alloreactive T cell population. In this study, we prospectively analyzed CD4+ CD25high CD45RO+ CD127high T cells after induction with either thymoglobulin or basiliximab. Patients and methods: A total of 27 kidney transplant recipients were prospectively enrolled; 14 received thymoglobulin induction followed by a 4-day course of steroids with tacrolimus and mycophenolate mofetil ("thymo group"), and 13 received basiliximab induction followed by standard triple immunosuppression (tacrolimus, mycophenolate mofetil and prednisone) ("BSX group"). Phenotypic analysis by flow cytometry of the expression of CD25, CD45RO and CD127 on peripheral CD4+ T cells was performed at 0, 3 and 6 months after transplantation. Twenty-four healthy subjects were studied as controls. Results: There were no differences in baseline characteristics between the groups; at 6 months, patient survival (100%), graft survival (100%), serum creatinine (thymo group versus BSX group: 129 versus 125 micromol/l) and acute rejection (2/14 versus 2/13) were not significantly different. Thymoglobulin induction produced prolonged CD4 T cell depletion. Compared with pre-transplantation values, an expansion of the alloreactive T cell population was observed at 3 months in both the thymo (mean: from 6.38% to 14.72%) and BSX (mean: from 8.01% to 18.42%) groups. At 6 months, the alloreactive T cell population remained significantly expanded in the thymo group (16.92 ± 2.87%), whereas it tended to decrease in the BSX group (10.22 ± 1.38%). Conclusion: Overall, our results indicate that the expansion of alloreactive T cells occurs rapidly after transplantation in patients receiving either thymoglobulin or basiliximab induction. Whether differences emerge at later timepoints, and whether different immunosuppressive regimens modify this alloreactive population, remain to be studied.
Abstract:
From data collected during routine TDM, plasma concentrations of citalopram (CIT) and its metabolites demethylcitalopram (DCIT) and didemethylcitalopram (DDCIT) were measured in 345 plasma samples collected under steady-state conditions. They were from 258 patients treated with usual doses (20-60 mg/d) and from patients medicated with 80-360 mg/d CIT. Most patients had one or several comedications, including other antidepressants, antipsychotics, lithium, anticonvulsants, psychostimulants and somatic medications. Dose-corrected CIT plasma concentrations (C/D ratio) were 2.51 ± 2.25 ng mL-1 mg-1 (n = 258; mean ± SD). Patients >65 years had significantly higher dose-corrected CIT plasma concentrations (n = 56; 3.08 ± 1.35 ng mL-1 mg-1) than younger patients (n = 195; 2.35 ± 2.46 ng mL-1 mg-1) (P = 0.03). CIT plasma concentrations in the generally recommended dose range were [mean ± SD, (median)]: 57 ± 64 (45) ng/mL (10-20 mg/d; n = 64) and 117 ± 95 (91) ng/mL (21-60 mg/d; n = 96). At higher than usual doses, the following CIT concentrations were measured: 61-120 mg/d, 211 ± 103 (190) ng/mL (n = 93); 121-200 mg/d, 339 ± 143 (322) ng/mL (n = 70); 201-280 mg/d, 700 ± 408 (565) ng/mL (n = 18); 281-360 mg/d, 888 ± 620 (616) ng/mL (n = 4). When only one sample per patient (at the highest daily dose in the case of repeated dosages) is considered, there is a significant linear correlation (n = 48, r = 0.730; P < 0.001) between daily dose (10-200 mg/d) and CIT plasma concentration. In experiments with dogs, DDCIT was reported to affect the QT interval when present at concentrations >300 ng/mL. In this study, the DDCIT concentration reached 100 ng/mL in a patient treated with 280 mg/d CIT. Twelve other patients treated with 140-320 mg/d CIT had plasma DDCIT concentrations within the range 52-73 ng/mL. In a subgroup comprising patients treated with ≥160 mg/d CIT who had CIT plasma concentrations ≤300 ng/mL, and patients treated with ≤200 mg/d CIT who had CIT plasma concentrations ≥600 ng/mL, the enantiomers of CIT and DCIT were also analyzed. The highest S-CIT concentration measured in this subgroup was 327 ng/mL, in a patient treated with 140 mg/d CIT; the highest S-CIT concentration overall (632 ng/mL) was measured in a patient treated with 360 mg/d CIT. In conclusion, CIT plasma concentrations correlate linearly with CIT dose, even well above the usual dose range.
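The dose-corrected concentration (C/D ratio) and the dose-concentration correlation reported above are simple to compute; a minimal sketch in Python with hypothetical patient data (the study's raw values are not reproduced here), assuming NumPy and SciPy:

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical one-sample-per-patient data: daily CIT dose (mg/d) and
# steady-state plasma concentration (ng/mL)
doses = np.array([20, 40, 60, 100, 140, 200])
concs = np.array([48, 95, 118, 205, 310, 420])

cd_ratios = concs / doses        # dose-corrected concentration, ng/mL per mg/d
r, p = pearsonr(doses, concs)    # linear dose-concentration correlation

print("C/D ratios:", np.round(cd_ratios, 2))
print(f"r = {r:.3f}, P = {p:.4f}")
```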
Abstract:
Numerous phase I and II clinical trials testing the safety and immunogenicity of various peptide vaccine formulations based on CTL-defined tumor antigens in cancer patients have been reported during the last 7 years. While specific T-cell responses can be detected in a variable fraction of immunized patients, an even smaller but significant fraction of these patients has objective tumor responses. Efficient therapeutic vaccination should aim at boosting naturally occurring antitumor T- and B-cell responses and at sustaining a large number of tumor antigen specific and fully functional effector T cells at tumor sites. Recent advances in our ability to monitor tumor antigen specific CD8 T-cell responses quantitatively and qualitatively will greatly accelerate progress in this field.
Abstract:
The main objective of this study was to evaluate the hydraulic performance of riprap spurs and weirs in controlling bank erosion along the southern part of the Raccoon River upstream of the U.S. Highway 169 Bridge, using the commercially available model FESWMS and field monitoring. Based on two years of monitoring and numerical modeling, the design of the structures, including their spacing and stability, was found to be successful overall. The riprap material incorporated into the structures was directly and favorably correlated with the flow transmission through the structure; in other words, it dictated the permeable nature of the structure. The permeable dikes and weirs chosen in this study created a smaller volume of scour in the vicinity of the structure toes and thus carry less risk of collapse than comparable impermeable structures. Because the structures permitted the transmission of flow through them, fine sand particles could fill the gaps in the rock interstices and thus cement and better stabilize the structures. During bank-full flows, the maximum scour hole was recorded away from the structure toes, and the scour-hole size was directly related to the protrusion angle of the structure to the flow. It was concluded that the proposed structure inclination with respect to the main flow direction was appropriate, since it provides maximum bank protection while creating the largest volume of local scour away from the structure and toward the center of the channel. Furthermore, the lowest potential for bank erosion also occurs with the present set-up design chosen by the IDOT. About 2 ft of new material was deposited in the area located between the structures in the period extending from the construction day to May 2007. Surveys obtained by sonar and the presence of vegetation indicate that new material has been added at the bank toes. Finally, the structures provided higher variability in bed topography, forming resting pools, creating flow shade on the leeward side of the structures, and separating bed substrate under different flow conditions. Another notable environmental benefit of rock riprap weirs and dikes is the creation of resting pools, especially in year 2007 (the second year of the project). The literature indicates that the magnitude of these benefits to aquatic habitat is directly related to the induced scour-hole volume.
Roadway Lighting and Safety: Phase II – Monitoring Quality, Durability and Efficiency, November 2011
Abstract:
This Phase II project follows a previous project titled Strategies to Address Nighttime Crashes at Rural, Unsignalized Intersections. Based on the results of the previous study, the Iowa Highway Research Board (IHRB) indicated interest in pursuing further research addressing the quality of lighting, rather than just the presence of light, with respect to safety. The research team supplemented the literature review from the previous study, specifically addressing lighting level in terms of measurement, the relationship between light levels and safety, and lamp durability and efficiency. The Center for Transportation Research and Education (CTRE) teamed with a national research leader in roadway lighting, the Virginia Tech Transportation Institute (VTTI), to collect the data. Integral to the data collection effort was the creation of the Roadway Monitoring System (RMS). The RMS allowed the research team to collect lighting data and approach information for each rural intersection identified in the previous phase. After data cleanup, the final data set contained illuminance data for 101 lighted intersections (of the 137 lighted intersections in the first study). Data analysis included a robust statistical analysis based on Bayesian techniques. Average illuminance, average glare, and average uniformity ratio values were used to classify the quality of lighting at the intersections.
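For readers unfamiliar with the metrics in the last sentence, the average-to-minimum uniformity ratio is conventionally computed from a grid of illuminance measurements; a minimal sketch in Python with made-up readings (the report's own data and exact metric definitions are not reproduced here):

```python
# Hypothetical horizontal illuminance readings (lux) on a grid at one intersection
readings = [8.2, 6.5, 11.3, 9.8, 4.1, 7.7, 5.9, 10.4]

avg = sum(readings) / len(readings)
uniformity = avg / min(readings)  # average-to-minimum uniformity ratio

print(f"average illuminance: {avg:.1f} lux")
print(f"uniformity ratio (E_avg/E_min): {uniformity:.2f}")
```

A lower uniformity ratio indicates more even lighting across the intersection, which is one way a study can quantify lighting quality rather than mere presence of light.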