Abstract:
PURPOSE: To correlate the dimension of the visual field (VF) tested by Goldmann kinetic perimetry with the extent of visibility of the highly reflective layer between inner and outer segments of photoreceptors (IOS) seen in optical coherence tomography (OCT) images in patients with retinitis pigmentosa (RP). METHODS: In a retrospectively designed cross-sectional study, 18 eyes of 18 patients with RP were examined with OCT and Goldmann perimetry using test target I4e and compared with 18 eyes of 18 control subjects. A-scans of raw scan data of Stratus OCT images (Carl Zeiss Meditec AG, Oberkochen, Germany) were quantitatively analyzed for the presence of the signal generated by the highly reflective layer between the IOS in OCT images. Starting at the fovea, the distance to which this signal was detectable was measured. Visual fields were analyzed by measuring the distance from the center point to isopter I4e. OCT and visual field data were analyzed in a clockwise fashion every 30 degrees, and corresponding measures were correlated. RESULTS: In corresponding alignments, the distance from the center point to isopter I4e and the distance to which the highly reflective signal from the IOS can be detected correlate significantly (r = 0.75, P < 0.0001). The greater the distance in VF, the greater the distance measured in OCT. CONCLUSIONS: The authors hypothesize that the retinal structure from which the highly reflective layer between the IOS emanates is of critical importance for visual and photoreceptor function. Further research is warranted to determine whether this may be useful as an objective marker of progression of retinal degeneration in patients with RP.
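The per-meridian correlation described above (one VF distance and one OCT distance for each 30-degree alignment, compared with Pearson's r) can be sketched as follows. The distance values here are invented for illustration; the study's actual data yielded r = 0.75.

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical paired measurements, one per 30-degree meridian (12 total):
# distance from the center point to isopter I4e (VF, degrees) and distance
# from the fovea to where the IOS signal is last detectable (OCT, mm).
vf_distances = [30, 28, 25, 20, 18, 15, 17, 22, 26, 29, 31, 27]
oct_distances = [2.8, 2.6, 2.3, 1.9, 1.7, 1.5, 1.6, 2.1, 2.4, 2.7, 2.9, 2.5]

r = pearson_r(vf_distances, oct_distances)
```

With perfectly paired meridians like these, r is close to 1; noisier clinical data would pull it toward the study's reported 0.75.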
Abstract:
BACKGROUND: The aim of this study was to determine the performance of a new, 3D-monitor-based objective stereotest in children under the age of four. METHODS: Random-dot circles (diameter 10 cm, crossed, disparity of 0.34 degrees) randomly changing their position were presented on a 3D monitor while eye movements were monitored by infrared photo-oculography. If 3 or more consecutive stimuli were seen, a positive response was assumed. One hundred thirty-four normal children aged 2 months to 4 years (average 17 ± 15.3 months) were examined. RESULTS: Below the age of 12 months, we were not able to obtain a response to the 3D stimulus. For older children the following rates of positive responses were found: 12-18 months 25%, 18-24 months 10%, 24-30 months 16%, 30-36 months 57%, 36-42 months 100%, and 42-48 months 91%. Multiple linear logistic regression showed a significant influence of the explanatory variables age (p<0.00001) and child cooperation (p<0.001) on stimulus recognition, but no influence of gender (p>0.1). CONCLUSIONS: This 3D-monitor-based stereotest allows an objective measurement of random-dot stereopsis in younger children. It might open new ways to screen children for visual abnormalities and to study the development of stereovision. However, the current experimental setting does not allow determination of random-dot stereopsis in children younger than 12 months.
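The logistic-regression result above (age and cooperation predict stimulus recognition) can be illustrated with a minimal sketch. The coefficients below are made-up placeholders chosen only to reproduce the qualitative pattern, a steep rise in positive responses with age; the abstract reports significance levels, not effect sizes.

```python
import math

def p_positive(age_months, cooperative, b0=-6.0, b_age=0.18, b_coop=1.2):
    """Probability of a positive response (>= 3 consecutive stimuli seen)
    under a logistic model. All coefficients are hypothetical illustrations,
    not the fitted values from the study."""
    z = b0 + b_age * age_months + b_coop * (1 if cooperative else 0)
    return 1.0 / (1.0 + math.exp(-z))

# Probability rises steeply with age, mirroring the jump the study observed
# between the 30-36 month and 36-42 month groups.
p_12 = p_positive(12, True)
p_40 = p_positive(40, True)
```

The cooperation term shifts the whole curve, which matches the study's finding that child cooperation, but not gender, influenced recognition.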
Abstract:
We used the Green's functions from auto-correlations and cross-correlations of seismic ambient noise to monitor temporal velocity changes in the subsurface at Villarrica volcano in the Southern Andes of Chile. Campaigns were conducted from March to October 2010 and February to April 2011 with 8 broadband and 6 short-period stations, respectively. We prepared the data by removing the instrument response, normalizing with a root-mean-square method, whitening the spectra, and filtering from 1 to 10 Hz. This frequency band was chosen based on the relatively high background noise level in that range. Hour-long auto- and cross-correlations were computed and the Green's functions stacked by day and total time. To track the temporal velocity changes, we stretched a 24 hour moving window of correlation functions from 90% to 110% of the original and cross-correlated them with the total stack. All of the stations' auto-correlations detected what is interpreted as an increase in velocity in 2010, with an average increase of 0.13%. Cross-correlations from station V01, near the summit, to the other stations show comparable changes that are also interpreted as increases in velocity. We attribute this change to the closing of cracks in the subsurface due either to seasonal snow loading or regional tectonics. In addition to the common increase in velocity across the stations, there are excursions in velocity on the same order lasting several days. Amplitude decreases as the station's distance from the vent increases, suggesting that these excursions may be attributed to changes within the volcanic edifice. On at least two occasions the amplitudes at stations V06 and V07, the stations farthest from the vent, are smaller. Similar short temporal excursions were seen in the auto-correlations from 2011; however, there was little to no increase in the overall velocity.
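The stretching measurement described above (resample a correlation window over a range of stretch factors and keep the factor that best matches the reference stack) can be sketched as follows. This is a minimal synthetic illustration, not the authors' code; a real workflow would use a seismology package such as ObsPy on instrument-corrected waveforms.

```python
import math

def stretch(trace, factor):
    """Resample a trace stretched in time by `factor` (e.g. 0.90..1.10)
    using linear interpolation; samples past the end are clamped."""
    n = len(trace)
    out = []
    for i in range(n):
        t = i / factor                      # position in the original trace
        j = int(t)
        if j >= n - 1:
            out.append(trace[-1])
        else:
            frac = t - j
            out.append((1 - frac) * trace[j] + frac * trace[j + 1])
    return out

def correlate(a, b):
    """Zero-lag normalized cross-correlation of two equal-length traces."""
    num = sum(x * y for x, y in zip(a, b))
    den = math.sqrt(sum(x * x for x in a) * sum(y * y for y in b))
    return num / den

def best_stretch(reference, current, factors):
    """Return the stretch factor whose resampled `current` best matches the
    reference stack; the factor maps to a relative velocity change
    (sign convention depends on the definition used)."""
    return max(factors, key=lambda f: correlate(reference, stretch(current, f)))

# Synthetic demo: a decaying wavelet compressed by 2% stands in for a daily
# correlation window whose medium velocity has changed slightly.
reference = [math.sin(0.3 * i) * math.exp(-0.01 * i) for i in range(200)]
current = stretch(reference, 0.98)
factors = [0.96 + 0.005 * k for k in range(17)]   # grid from 0.96 to 1.04
recovered = best_stretch(reference, current, factors)
```

The recovered factor (about 1.02 here) undoes the imposed 0.98 compression; the study applied the same idea over a 90%-110% grid against the total stack.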
Abstract:
Semi-natural grasslands, biodiversity hotspots in Central Europe, suffer from the cessation of traditional land-use. The amount and intensity of these changes challenge current monitoring frameworks typically based on classic indicators such as selected target species or diversity indices. Indicators based on plant functional traits provide an interesting extension since they reflect ecological strategies at the individual level and ecological processes at the community level. They typically show convergent responses to gradients of land-use intensity over scales and regions, are more directly related to environmental drivers than diversity components themselves, and enable detecting directional changes in whole community dynamics. However, probably due to their labor- and cost-intensive assessment in the field, they have rarely been applied as indicators so far. Here we suggest overcoming these limitations by calculating indicators with plant traits derived from online accessible databases. Aiming to provide a minimal trait set to monitor effects of land-use intensification on plant diversity, we investigated relationships between 12 community mean traits, 2 diversity indices, and 6 predictors of land-use intensity within grassland communities of 3 different regions in Germany (part of the German ‘Biodiversity Exploratory’ research network). Using standardized traits and diversity measures, null models, and linear mixed models, we confirmed (i) strong links between functional community composition and plant diversity, (ii) that traits are closely related to land-use intensity, and (iii) that functional indicators are equally or even more sensitive to land-use intensity than traditional diversity indices. The deduced trait set consisted of 5 traits, i.e., specific leaf area (SLA), leaf dry matter content (LDMC), seed release height, leaf distribution, and onset of flowering.
These database-derived traits enable the early detection of changes in community structure indicative of future diversity loss. As an addition to current monitoring measures, they make it possible to better link environmental drivers to the processes controlling community dynamics.
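The "community mean traits" used as indicators above are typically community-weighted means: species-level trait values from a database, averaged with local abundance as weights. A minimal sketch, with invented cover and SLA values (real trait values would come from an online database such as TRY or LEDA):

```python
def community_weighted_mean(abundances, trait_values):
    """Community-weighted mean (CWM) of a trait: the abundance-weighted
    average of species-level trait values, summarizing functional
    composition at the community level."""
    total = sum(abundances)
    return sum(a * t for a, t in zip(abundances, trait_values)) / total

# Hypothetical grassland plot: relative cover (%) of four species and their
# specific leaf area (SLA, mm^2/mg) taken from a trait database.
cover = [40, 30, 20, 10]
sla = [18.0, 22.5, 30.0, 12.0]

cwm_sla = community_weighted_mean(cover, sla)  # 21.15 for these numbers
```

Tracking how CWM values such as this shift along a land-use intensity gradient is what lets trait-based indicators flag directional community change before diversity indices respond.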
Abstract:
Background: WHO's 2013 revisions to its Consolidated Guidelines on antiretroviral drugs recommend routine viral load monitoring, rather than clinical or immunological monitoring, as the preferred monitoring approach on the basis of clinical evidence. However, HIV programmes in resource-limited settings require guidance on the most cost-effective use of resources in view of other competing priorities such as expansion of antiretroviral therapy coverage. We assessed the cost-effectiveness of alternative patient monitoring strategies. Methods: We evaluated a range of monitoring strategies, including clinical, CD4 cell count, and viral load monitoring, alone and together, at different frequencies and with different criteria for switching to second-line therapies. We used three independently constructed and validated models simultaneously. We estimated costs on the basis of resource use projected in the models and associated unit costs; we quantified impact as disability-adjusted life years (DALYs) averted. We compared alternatives using incremental cost-effectiveness analysis. Findings: All models show that clinical monitoring delivers significant benefit compared with a hypothetical baseline scenario with no monitoring or switching. Regular CD4 cell count monitoring confers a benefit over clinical monitoring alone, at an incremental cost that makes it affordable in more settings than viral load monitoring, which is currently more expensive. Viral load monitoring every 6-12 months, without CD4 cell counts, provides the greatest reductions in morbidity and mortality, but incurs a high cost per DALY averted, resulting in lost opportunities to generate health gains if implemented instead of increasing antiretroviral therapy coverage or expanding antiretroviral therapy eligibility.
Interpretation: The priority for HIV programmes should be to expand antiretroviral therapy coverage, first at a CD4 cell count lower than 350 cells per μL, and then at a CD4 cell count lower than 500 cells per μL, using lower-cost clinical or CD4 monitoring. At current costs, viral load monitoring should be considered only after high antiretroviral therapy coverage has been achieved. Point-of-care technologies and other factors reducing costs might make viral load monitoring more affordable in future. Funding: Bill & Melinda Gates Foundation, WHO.
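The incremental cost-effectiveness comparison that drives the interpretation above can be sketched as follows. All figures are invented placeholders purely to show the arithmetic; the study's own costs and DALY estimates come from its three simulation models.

```python
def icer(cost_new, cost_old, dalys_new, dalys_old):
    """Incremental cost-effectiveness ratio: extra cost per additional
    DALY averted when moving from one strategy to the next."""
    return (cost_new - cost_old) / (dalys_new - dalys_old)

# Hypothetical per-patient cost and DALYs averted for three strategies,
# ordered from least to most effective (numbers are illustrative only).
clinical = {"cost": 100.0, "dalys_averted": 1.00}
cd4 = {"cost": 180.0, "dalys_averted": 1.30}
viral_load = {"cost": 400.0, "dalys_averted": 1.45}

icer_cd4 = icer(cd4["cost"], clinical["cost"],
                cd4["dalys_averted"], clinical["dalys_averted"])
icer_viral = icer(viral_load["cost"], cd4["cost"],
                  viral_load["dalys_averted"], cd4["dalys_averted"])
```

With numbers shaped like these, each step up the effectiveness ladder buys smaller health gains at a steeper price per DALY averted, which is the pattern behind recommending lower-cost monitoring while coverage is still being expanded.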