889 results for Step count
Abstract:
The mušlālu, which is usually understood to be a step- or ramped gate, has so far rarely been discussed in much detail, even though the texts mentioning it have been known for a long time. This is no doubt because the evidence on which to base both philological and archaeological investigations appears sparse and scattered. Up to now, scholars have tended to focus on specific mušlālu-structures. Consequently, there are as many discussions as there are mušlālu-structures. This article attempts to get a sense of the broader picture by bringing together the philological and archaeological data hitherto available with the aim of discussing the mušlālu as term and structure, clarifying the semantics and reassessing the architectural identifications. We hope to demonstrate that it is actually possible to take the evidence one step further and correct the mušlālu image that prevails.
Abstract:
BACKGROUND Antiretroviral therapy (ART) initiation is now recommended irrespective of CD4 count. However, data on the relationship between CD4 count at ART initiation and loss to follow-up (LTFU) are limited and conflicting. METHODS We conducted a cohort analysis including all adults initiating ART (2008-2012) at three public sector sites in South Africa. LTFU was defined as no visit in the 6 months before database closure. The Kaplan-Meier estimator and Cox proportional hazards models were used to examine the relationship between CD4 count at ART initiation and 24-month LTFU. Final models were adjusted for demographics, year of ART initiation and programme expansion, and corrected for unascertained mortality. RESULTS Among 17 038 patients, the median CD4 count at initiation increased from 119 cells/μL (IQR 54-180) in 2008 to 257 cells/μL (IQR 175-318) in 2012. In unadjusted models, observed LTFU was associated with both CD4 counts <100 cells/μL and CD4 counts ≥300 cells/μL. After adjustment, patients with CD4 counts ≥300 cells/μL were 1.35 (95% CI 1.12 to 1.63) times as likely to be LTFU after 24 months as those with CD4 counts of 150-199 cells/μL. This increased risk for patients with CD4 counts ≥300 cells/μL was largest in the first 3 months on treatment. Correction for unascertained deaths attenuated the association between CD4 counts <100 cells/μL and LTFU, while the association between CD4 counts ≥300 cells/μL and LTFU persisted. CONCLUSIONS Patients initiating ART at higher CD4 counts may be at increased risk of LTFU. As programmes initiate patients at higher CD4 counts, models of ART delivery need to be reoriented to support long-term retention.
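The retention curves behind this kind of analysis come from the Kaplan-Meier product-limit estimator. A minimal pure-Python sketch of that estimator is shown below; the follow-up times and event indicators are hypothetical illustrations, not the study's data, and the full analysis would additionally use a Cox model for adjusted hazard ratios.

```python
# Minimal Kaplan-Meier estimator for right-censored follow-up data.
# Hypothetical data; "event" here would be LTFU, censoring = still in care.

def kaplan_meier(times, events):
    """Return (time, survival) pairs.

    times  : follow-up time per patient (e.g. months on ART)
    events : 1 if the endpoint (LTFU) occurred, 0 if censored
    """
    survival = 1.0
    curve = []
    at_risk = len(times)
    for t in sorted(set(times)):           # walk distinct times in order
        d = sum(1 for ti, ei in zip(times, events) if ti == t and ei == 1)
        if d > 0:
            survival *= 1 - d / at_risk    # product-limit step at each event time
            curve.append((t, survival))
        at_risk -= sum(1 for ti in times if ti == t)  # drop events and censorings
    return curve

# Hypothetical: months until LTFU (event=1) or censoring (event=0)
times  = [3, 6, 6, 12, 18, 24, 24, 24]
events = [1, 1, 0, 1, 0, 1, 0, 0]
curve = kaplan_meier(times, events)
```

Each step multiplies the running survival probability by the fraction of the current risk set that did not experience the event, which is what makes the estimator valid under right-censoring.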
Abstract:
OBJECTIVE To illustrate an approach to compare CD4 cell count and HIV-RNA monitoring strategies in HIV-positive individuals on antiretroviral therapy (ART). DESIGN Prospective studies of HIV-positive individuals in Europe and the USA in the HIV-CAUSAL Collaboration and The Center for AIDS Research Network of Integrated Clinical Systems. METHODS Antiretroviral-naive individuals who initiated ART and became virologically suppressed within 12 months were followed from the date of suppression. We compared three CD4 cell count and HIV-RNA monitoring strategies: once every (1) 3 ± 1 months, (2) 6 ± 1 months, and (3) 9-12 ± 1 months. We used inverse-probability weighted models to compare these strategies with respect to clinical, immunologic, and virologic outcomes. RESULTS In 39,029 eligible individuals, there were 265 deaths and 690 AIDS-defining illnesses or deaths. Compared with the 3-month strategy, the mortality hazard ratios (95% CIs) were 0.86 (0.42 to 1.78) for the 6-month and 0.82 (0.46 to 1.47) for the 9-12-month strategy. The respective 18-month risk ratios (95% CIs) of virologic failure (RNA >200) were 0.74 (0.46 to 1.19) and 2.35 (1.56 to 3.54), and the 18-month mean CD4 differences (95% CIs) were -5.3 (-18.6 to 7.9) and -31.7 (-52.0 to -11.3). The estimates for the 2-year risk of AIDS-defining illness or death were similar across strategies. CONCLUSIONS Our findings suggest that the monitoring frequency of virologically suppressed individuals can be decreased from every 3 months to every 6, 9, or 12 months with respect to clinical outcomes. Because the effects of different monitoring strategies could take years to materialize, longer follow-up is needed to fully evaluate this question.
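The core of the inverse-probability weighting idea used to compare such strategies can be sketched in a few lines: people whose observed monitoring happened to follow a given strategy are re-weighted by the inverse of their modeled probability of following it, so the weighted group stands in for everyone. The toy numbers below are hypothetical, and a real analysis would estimate the probabilities from covariates and use weighted hazard models rather than a weighted mean.

```python
# Toy inverse-probability-weighted (IPW) outcome estimate under one strategy.
# All values are hypothetical illustrations.

def ipw_mean(outcomes, followed, prob_followed):
    """Weighted outcome mean among those who followed the strategy.

    outcomes      : observed outcome per person (e.g. 1 = virologic failure)
    followed      : 1 if observed monitoring matched the strategy, else 0
    prob_followed : modeled probability of following it, given covariates
    """
    num = sum(y * f / p for y, f, p in zip(outcomes, followed, prob_followed))
    den = sum(f / p for f, p in zip(followed, prob_followed))
    return num / den

# People with a low probability of following the strategy get up-weighted.
outcomes = [1, 0, 0, 1, 0]
followed = [1, 1, 0, 1, 1]
probs    = [0.8, 0.5, 0.5, 0.4, 0.8]
risk = ipw_mean(outcomes, followed, probs)
```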
Abstract:
Leukopenia, the leukocyte count, and prognosis of disease are interrelated; a systematic search of the literature was undertaken to ascertain the strength of the evidence. One hundred seventy-one studies were found from 1953 onward pertaining to the predictive capabilities of the leukocyte count. Of those studies, 42 met inclusion criteria. An estimated range of 2,200 cells/μL to 7,000 cells/μL was determined as that which indicates good prognosis in disease and indicates the least amount of risk to an individual overall. Tables of the evidence are included indicating the disparate populations examined and the possible degree of association.
Abstract:
Patients who had started HAART (highly active antiretroviral treatment) under the previous, aggressive DHHS guidelines (1997) underwent life-long continuous HAART, which was associated with many short-term as well as long-term complications. Many interventions attempted to reduce those complications, including intermittent treatment, also called pulse therapy. Many studies examined the determinants of the rate of fall in CD4 count after interruption, as these data would help guide treatment interruptions. The data set used here was part of a cohort study under way at the Johns Hopkins AIDS Service since January 1984, in which data were collected both prospectively and retrospectively. The patients in this data set consisted of 47 patients receiving pulse therapy with the aim of reducing long-term complications.

The aim of this project was to study the impact of virologic and immunologic factors on the rate of CD4 loss after treatment interruption. The exposure variables of interest were age, race, gender, and CD4 cell count and HIV RNA level at HAART initiation. The rates of change of CD4 cell count after treatment interruption were estimated from the observed data using longitudinal data analysis methods (i.e., a linear mixed model). Random effects accounted for the repeated measures of CD4 per person after treatment interruption. The regression coefficient estimates from the model were then used to produce subject-specific rates of CD4 change, accounting for group trends in change.

The rate of fall of CD4 count did not depend on CD4 cell count or viral load at initiation of treatment; thus these factors may not be useful for determining who has a chance of successful treatment interruption. CD4 and viral load were also examined with t-tests and ANOVA after grouping on medians and quartiles, to detect any difference in the mean rate of CD4 fall after interruption. There was no significant difference between the groups, suggesting no association between the rate of fall of CD4 after treatment interruption and the above-mentioned exposure variables.
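The quantity being modeled here, each subject's rate of CD4 change after interruption, can be illustrated with a simple per-subject least-squares slope. This is only a sketch of the estimand on hypothetical measurements; the study itself used a linear mixed model, which shrinks each subject's slope toward the group trend rather than fitting subjects independently.

```python
# Per-subject least-squares slope of CD4 on time since interruption.
# Hypothetical measurements, not the cohort's data.

def ols_slope(xs, ys):
    """Least-squares slope of ys on xs (here: CD4 cells/uL per month)."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# One hypothetical subject: months after interruption -> CD4 (cells/uL)
months = [0, 2, 4, 6]
cd4    = [520, 470, 430, 370]
rate = ols_slope(months, cd4)   # negative slope: CD4 is falling
```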
Abstract:
Purpose. This project was designed to describe the association between wasting and CD4 cell counts in HIV-infected men, in order to better understand the role of wasting in the progression of HIV infection.

Methods. Baseline and prevalence data were collected from a cross-sectional survey of 278 HIV-infected men seen at the Houston Veterans Affairs Medical Center Special Medicine Clinic from June 1, 1991 to January 1, 1994. A follow-up study was conducted among those at risk to investigate the incidence of wasting and the association between wasting and low CD4 cell counts. Wasting was described by four methods: Z-scores for age-, sex-, and height-adjusted weight; sex- and age-adjusted mid-arm muscle circumference (MAMC); fat-free mass (FFM); and a ratio of extra-cellular mass (ECM) to body-cell mass (BCM) > 1.20. FFM, ECM, and BCM were estimated from bioelectrical impedance analysis. MAMC was calculated from the triceps skinfold and mid-arm circumference. The relationship between wasting and covariates was examined with logistic regression in the cross-sectional study, and with Poisson regression in the follow-up study. The association between death and wasting was examined with Cox regression.

Results. The prevalence of wasting ranged from 5% (weight and ECM:BCM) to almost 14% (MAMC and FFM) among the 278 men examined. The odds of wasting associated with a baseline CD4 cell count <200 was significant for every method but weight, and ranged from 4.6 to 12.7. Use of antiviral therapy was significantly protective for MAMC, FFM and ECM:BCM (OR ≈ 0.2), whereas the need for antibacterial therapy was a risk (OR 3.1, 95% CI 1.1-8.7). The average incidence of wasting ranged from 4 to 16 per 100 person-years among the approximately 145 men followed for 160 person-years. Low CD4 cell count appeared to increase the risk of wasting, but statistical significance was not reached. The effect of the small sample size on the power to detect a significant association should be considered. Wasting by MAMC and FFM was significantly associated with death after adjusting for baseline serum albumin concentration and CD4 cell count.

Conclusions. Wasting by MAMC and FFM was strongly associated with baseline CD4 cell counts in both the prevalence and incidence studies, and was a strong predictor of death. Of the two methods, MAMC is convenient, has reference population data available, and may be the most appropriate for assessing the nutritional status of HIV-infected men.
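The MAMC derivation mentioned in the methods follows the standard anthropometric relation, which treats the arm as a cylinder of muscle wrapped in a fat layer of thickness given by the triceps skinfold. A minimal sketch with hypothetical example values:

```python
import math

# Mid-arm muscle circumference (MAMC) from mid-arm circumference (MAC)
# and triceps skinfold (TSF). Example values are hypothetical.

def mamc(mac_cm, tsf_cm):
    """Standard relation: MAMC (cm) = MAC (cm) - pi * TSF (cm)."""
    return mac_cm - math.pi * tsf_cm

value = mamc(29.0, 1.0)   # MAC 29 cm, TSF 1.0 cm (= 10 mm)
```

Because TSF is often reported in millimetres, the relation is equivalently written MAMC = MAC − 0.314 × TSF(mm).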
Abstract:
Early and accurate detection of TB disease in HIV-infected individuals is a critical step for a successful TB program. In Vietnam, the diagnosis of TB disease, which is based predominantly on clinical examination, chest radiography (CXR) and acid-fast bacilli (AFB) sputum smear, has been shown to have low sensitivity in immunocompromised patients. Sputum culture is not routinely performed for patients with AFB-negative smears, even in HIV-infected individuals.

Against that background, we conducted this cross-sectional study to estimate the prevalence of sputum culture-confirmed pulmonary tuberculosis (PTB), smear-negative PTB, and multidrug-resistant TB (MDR-TB) in the HIV-infected population of Ho Chi Minh City (HCMC), the largest city in Vietnam, where both TB and HIV are highly prevalent. We also evaluated the diagnostic performance of various algorithms based on tools routinely available in Vietnam, such as symptom screening, CXR, and AFB smear. Nearly 400 subjects were consecutively recruited from HIV-infected patients seeking care at the An Hoa Clinic in District 6 of Ho Chi Minh City from August 2009 through June 2010. Participants' demographic data, clinical status, CXR, and laboratory results were collected. A multiple logistic regression model was developed to assess the association of covariates with PTB.

The prevalence of smear-positive TB, smear-negative TB, resistant TB, and MDR-TB were 7%, 2%, 5%, 2.5%, and 0.3%, respectively. Adjusted odds ratios for low CD4+ cell count, positive sputum smear, and CXR with respect to a positive sputum culture were 3.17, 32.04, and 4.28, respectively. Clinical findings alone had poor sensitivity, but the combination of CD4+ cell count, sputum smear, and CXR provided a more accurate diagnosis.

These results support the routine use of sputum culture to improve the detection of TB disease in HIV-infected individuals in Vietnam. When routine sputum culture is not available, an algorithm combining CD4+ cell count, sputum smear, and CXR is recommended for diagnosing PTB. Future studies on more affordable, rapid, and accurate tests for TB infection are also needed to provide timely treatment for patients in need, reduce mortality, and minimize TB transmission to the general population.
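Diagnostic performance of an algorithm like the one above is scored against the culture reference standard from a 2x2 table. The sketch below shows the computation only; the counts are hypothetical and not the study's data.

```python
# Sensitivity and specificity of a diagnostic rule versus a reference
# standard (here, sputum culture). Counts are hypothetical.

def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical 2x2 table for a combined CD4 + smear + CXR rule
sens, spec = sensitivity_specificity(tp=27, fn=9, tn=320, fp=40)
```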
Abstract:
Invited commentary on "Computerizing Social-Emotional Assessment for School Readiness".
Abstract:
CHARACTERIZATION OF THE COUNT RATE PERFORMANCE AND EVALUATION OF THE EFFECTS OF HIGH COUNT RATES ON MODERN GAMMA CAMERAS. Michael Stephen Silosky, B.S. Supervisory Professor: S. Cheenu Kappadath, Ph.D.

Evaluation of count rate performance (CRP) is an integral component of gamma camera quality assurance, and measurement of system dead time (τ) is important for quantitative SPECT. The CRP of three modern gamma cameras was characterized using established methods (Decay and Dual Source) under a variety of experimental conditions. For the Decay method, input count rate was plotted against observed count rate and fit to the paralyzable detector model (PDM) to estimate τ (Rates method). A novel expression for observed counts as a function of measurement time interval was derived, and the observed counts were fit to this expression to estimate τ (Counts method). Correlation and Bland-Altman analyses were performed to assess agreement between the two estimates of τ. The dependence of τ on energy window definition and incident energy spectrum was characterized. The Dual Source method was also used to estimate τ; its agreement with the Decay method under identical conditions, along with the effects of total activity and of the ratio of source activities, was investigated. Additionally, the effects of count rate on several performance metrics were evaluated. The CRP curves for each system agreed with the PDM at low count rates but deviated substantially at high count rates. Estimates of τ for the paralyzable portion of the CRP curves using the Rates and Counts methods were highly correlated (r=0.999) but showed a small (~6%) difference. No significant difference was observed between the highly correlated estimates of τ using the Decay and Dual Source methods under identical experimental conditions (r=0.996). Estimates of τ increased as a power-law function with decreasing ratio of counts in the photopeak to total counts, and linearly with decreasing spectral effective energy. Dual Source estimates of τ varied quadratically with the ratio of single-source to combined-source activity, and linearly with the total activity used, across a large range. Image uniformity, spatial resolution, and energy resolution degraded linearly with count rate, and image-distorting effects were observed. Guidelines for CRP testing and a possible method for correcting count rate losses in clinical images have been proposed.
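The paralyzable detector model referred to above relates the true input rate n to the observed rate m by m = n·exp(−nτ): the observed rate rises with input, peaks at n = 1/τ, and then falls as pile-up paralyzes the detector. A short sketch, with an illustrative dead time rather than any of the systems characterized here, computes the forward model and inverts it numerically on the rising branch:

```python
import math

# Paralyzable dead-time model: observed rate m = n * exp(-n * tau).
# tau below is illustrative, not a measured system value.

def observed_rate(n, tau):
    """Observed count rate for true input rate n under the paralyzable model."""
    return n * math.exp(-n * tau)

def true_rate(m, tau):
    """Recover n from m by bisection on the rising branch (n < 1/tau)."""
    lo, hi = 0.0, 1.0 / tau          # the curve peaks at n = 1/tau
    for _ in range(100):
        mid = (lo + hi) / 2
        if observed_rate(mid, tau) < m:
            lo = mid                 # need a larger input rate
        else:
            hi = mid
    return (lo + hi) / 2

tau = 5e-6                        # 5 microseconds (illustrative)
m = observed_rate(100_000, tau)   # observed rate at 100 kcps input
n = true_rate(m, tau)             # recovers the input rate numerically
```

The inversion is well-defined only below the peak, which is why dead-time corrections become ambiguous, and CRP curves deviate from the model, at very high count rates.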
Abstract:
Geochemical and mineralogical proxies for paleoenvironmental conditions rest on the underlying assumption that climate variations have an impact on terrestrial weathering conditions. Varying properties of terrigenous sediments deposited at sea are therefore often interpreted in terms of paleoenvironmental change. In gravity core GeoB9307-3 (18° 33.99' S, 37° 22.89' E), located off the Zambezi River, environmental changes during Heinrich Stadial 1 (HS 1) and the Younger Dryas (YD) are likewise accompanied by changing properties of the terrigenous sediment fraction. Our study focuses on the relationship between variability in the hydrological system and changes in the magnetic properties, major element geochemistry and granulometry of the sediments. We propose that changes in bulk sedimentary properties concur with environmental change, although not as a direct response to climate-driven pedogenic processes. Spatially varying rainfall intensities on a sub-basin scale modify sediment export from different parts of the Zambezi River basin. During humid phases, such as HS 1 and the YD, sediment was mainly exported from the coastal areas, while during more arid phases the sediments mirror the hinterland soil and lithological properties and are likely derived from the northern Shire sub-basin. We propose that a de-coupling of sedimentological and organic signals can occur with variable discharge and erosional activity.
Abstract:
Lake La Thuile, in the Northern French Prealps (874 m a.s.l.), provides an 18 m long sedimentary sequence spanning the entire Lateglacial/Holocene period. High-resolution multi-proxy (sedimentological, palynological, geochemical) analysis of the uppermost 6.2 meters reveals the Holocene dynamics of erosion in the catchment in response to landscape modifications. The mountain belt lies at an altitude relevant to the study of past human activities, and the watershed is sufficiently disconnected from large valleys to capture a local sedimentary signal. From 12,000 to 10,000 cal. BP (10 to 8 ka cal. BC), the onset of hardwood species triggered a drop in erosion following the Lateglacial/Holocene transition. From 10,000 to 4500 cal. BP (8 to 2.5 ka cal. BC), the forest became denser and favored slope stabilization, while erosion processes were very weak. A first erosive phase was initiated at ca. 4500 cal. BP without evidence of human presence in the catchment. The forest then declined at approximately 3000 cal. BP, suggesting the first human influence on the landscape. Two further erosive phases are related to anthropogenic activities: at approximately 2500 cal. BP (550 cal. BC) during the Roman period, and after 1600 cal. BP (350 cal. AD), with a substantial accentuation in the Middle Ages. In contrast, the lower erosion recorded during the Little Ice Age, when climate deteriorations are generally considered to produce an increased erosion signal in this region, suggests that anthropogenic activities dominated the erosive processes and completely masked the natural effects of climate on erosion in the late Holocene.