924 results for ALLOSTATIC LOAD
Abstract:
Background: Access to hepatitis B viral load (VL) testing is poor in sub-Saharan Africa (SSA) due to economic and logistical reasons. Objectives: To demonstrate the feasibility of testing dried blood spots (DBS) for hepatitis B virus (HBV) VL in a laboratory in Lusaka, Zambia, and to compare HBV VLs between DBS and plasma samples. Study design: Paired plasma and DBS samples from HIV-HBV co-infected Zambian adults were analyzed for HBV VL using the COBAS AmpliPrep/COBAS TaqMan HBV test (Version 2.0) and for HBV genotype by direct sequencing. We used Bland–Altman analysis to compare VLs between sample types and by genotype. Logistic regression analysis was conducted to assess the probability of an undetectable DBS result by plasma VL. Results: Among 68 participants, median age was 34 years, 61.8% were men, and median plasma HBV VL was 3.98 log IU/ml (interquartile range, 2.04–5.95). Among sequenced viruses, 28 were genotype A1 and 27 were genotype E. Bland–Altman plots suggested strong agreement between DBS and plasma VLs. DBS VLs were on average 1.59 log IU/ml lower than plasma, with 95% limits of agreement of −2.40 to −0.83 log IU/ml. At a plasma VL ≥2,000 IU/ml, the probability of an undetectable DBS result was 1.8% (95% CI: 0.5–6.6). At plasma VL ≥20,000 IU/ml this probability fell to 0.2% (95% CI: 0.03–1.7).
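A hedged illustration of the Bland–Altman computation behind figures like these, using made-up paired log viral loads rather than the study's data: the bias is the mean of the paired differences, and the 95% limits of agreement are that mean ± 1.96 standard deviations.

```python
import numpy as np

# Synthetic paired measurements (NOT the study data): DBS reads are
# simulated as systematically lower than plasma on the log10 scale.
rng = np.random.default_rng(4)
plasma = rng.uniform(2, 6, 68)               # log10 IU/ml
dbs = plasma - 1.6 + rng.normal(0, 0.4, 68)  # DBS systematically lower

diff = dbs - plasma
bias = diff.mean()                           # mean difference (bias)
sd = diff.std(ddof=1)
loa = (bias - 1.96 * sd, bias + 1.96 * sd)   # 95% limits of agreement
print(f"bias: {bias:.2f} log IU/ml, LoA: {loa[0]:.2f} to {loa[1]:.2f}")
```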
Abstract:
Rockfall protection barriers are connected to the ground by steel cables fixed with anchors and by foundations for the steel posts. It is common practice to measure the forces in the cables, whereas measuring the forces in the foundations has to date been inadequately resolved. An overview is presented of existing methods to measure the loads on the post foundations of rockfall protection barriers. Addressing some of the inadequacies of existing approaches, a novel sensor unit is presented that is able to capture the forces acting on post foundations in all six degrees of freedom. The sensor unit consists of four triaxial force sensors placed between two steel plates. To correctly convert the measurements into the directional forces acting on the foundation, a special in-situ calibration procedure is proposed that delivers a corresponding conversion matrix.
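The abstract does not give the calibration algebra. A minimal sketch of one plausible reading, assuming a least-squares fit: known calibration wrenches and the corresponding 12 sensor readings (four triaxial sensors) determine a 6x12 conversion matrix. All names, dimensions, and the fitting method here are assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
n_cases = 20                       # calibration load cases (need >= 12)
C_true = rng.normal(size=(6, 12))  # stand-in for the unknown conversion matrix

M = rng.normal(size=(12, n_cases))                     # 12 sensor readings per case
W = C_true @ M + 0.01 * rng.normal(size=(6, n_cases))  # known 6-DOF wrenches (noisy)

# Least-squares fit of C with C @ M ~= W, solved via M.T @ C.T = W.T.
C_T, *_ = np.linalg.lstsq(M.T, W.T, rcond=None)
C = C_T.T

forces = C @ M[:, [0]]  # convert one set of readings into foundation loads
print(np.round(forces.ravel(), 2))
```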
Abstract:
Visual neglect is considerably exacerbated by increases in visual attentional load. These detrimental effects of attentional load are hypothesised to depend on an interplay between dysfunctional inter-hemispheric inhibitory dynamics and load-related modulation of activity in cortical areas such as the posterior parietal cortex (PPC). Continuous Theta Burst Stimulation (cTBS) over the contralesional PPC reduces neglect severity. It is unknown, however, whether such positive effects also operate in the presence of the detrimental effects of heightened attentional load. Here, we examined the effects of cTBS on neglect severity in overt visual search (i.e., with eye movements) as a function of high and low visual attentional load conditions. Performance was assessed on the basis of target detection rates and eye movements in a computerised visual search task and in two paper-and-pencil tasks. cTBS significantly improved target detection performance, independently of attentional load. These ameliorative effects were significantly larger in the high than in the low load condition, thereby equating target detection across both conditions. Eye movement analyses revealed that the improvements were mediated by a redeployment of visual fixations to the contralesional visual field. These findings represent a substantive advance, because cTBS led to an unprecedented amelioration of overt search efficiency that was independent of visual attentional load.
Abstract:
BACKGROUND HIV-1 RNA viral load (VL) testing is recommended to monitor antiretroviral therapy (ART) but is not available in many resource-limited settings. We developed and validated CD4-based risk charts to guide targeted VL testing. METHODS We modeled the probability of virologic failure up to 5 years of ART based on current and baseline CD4 counts, developed decision rules for targeted VL testing of 10%, 20%, or 40% of patients in 7 cohorts of patients starting ART in South Africa, and plotted cutoffs for VL testing on colour-coded risk charts. We assessed the accuracy of risk chart-guided VL testing to detect virologic failure in validation cohorts from South Africa, Zambia, and the Asia-Pacific. RESULTS In total, 31,450 adult patients were included in the derivation and 25,294 patients in the validation cohorts. Positive predictive values increased with the percentage of patients tested: from 79% (10% tested) to 98% (40% tested) in the South African cohort, from 64% to 93% in the Zambian cohort, and from 73% to 96% in the Asia-Pacific cohort. Corresponding increases in sensitivity were from 35% to 68% in South Africa, from 55% to 82% in Zambia, and from 37% to 71% in Asia-Pacific. The area under the receiver operating characteristic curve increased from 0.75 to 0.91 in South Africa, from 0.76 to 0.91 in Zambia, and from 0.77 to 0.92 in Asia-Pacific. CONCLUSIONS CD4-based risk charts with optimal cutoffs for targeted VL testing may be useful to monitor ART in settings where VL capacity is limited.
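For readers unfamiliar with the metrics, here is a small synthetic sketch of how PPV and sensitivity are computed for a rule that flags the highest-risk fraction of patients for VL testing. The risk scores and failure model are invented and will not reproduce the study's numbers or trends; the sketch only illustrates the definitions.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10_000
risk = rng.uniform(size=n)                  # hypothetical model risk score
failure = rng.uniform(size=n) < 0.2 * risk  # failure more likely at high risk

def evaluate(frac_tested):
    flagged = risk >= np.quantile(risk, 1 - frac_tested)  # test top-risk fraction
    ppv = failure[flagged].mean()          # flagged patients who truly failed
    sensitivity = flagged[failure].mean()  # failing patients who were flagged
    return ppv, sensitivity

for frac in (0.10, 0.20, 0.40):
    ppv, sens = evaluate(frac)
    print(f"test {frac:.0%}: PPV = {ppv:.2f}, sensitivity = {sens:.2f}")
```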
Abstract:
The Out-of-Africa (OOA) dispersal ∼50,000 y ago is characterized by a series of founder events as modern humans expanded into multiple continents. Population genetics theory predicts an increase of mutational load in populations undergoing serial founder effects during range expansions. To test this hypothesis, we have sequenced full genomes and high-coverage exomes from seven geographically divergent human populations from Namibia, Congo, Algeria, Pakistan, Cambodia, Siberia, and Mexico. We find that individual genomes vary modestly in the overall number of predicted deleterious alleles. We show via spatially explicit simulations that the observed distribution of deleterious allele frequencies is consistent with the OOA dispersal, particularly under a model where deleterious mutations are recessive. We conclude that there is a strong signal of purifying selection at conserved genomic positions within Africa, but that many predicted deleterious mutations have evolved as if they were neutral during the expansion out of Africa. Under a model where selection is inversely related to dominance, we show that OOA populations are likely to have a higher mutation load due to increased allele frequencies of nearly neutral variants that are recessive or partially recessive.
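A minimal sketch of why weakly deleterious alleles can evolve "as if they were neutral" during serial founder events (my assumption of the mechanism, not the paper's spatially explicit simulation): when 2Ns << 1 in small founder demes, drift overwhelms selection.

```python
import numpy as np

rng = np.random.default_rng(2)

def founder_series(n_founders=50, s=0.001, q0=0.05, n_events=20):
    """Allele frequency after repeated founder bottlenecks plus weak selection."""
    q = q0
    for _ in range(n_events):
        q = rng.binomial(2 * n_founders, q) / (2 * n_founders)  # bottleneck drift
        q = q * (1 - s) / (1 - s * q)  # deterministic genic selection against q
    return q

reps = np.array([founder_series() for _ in range(5000)])
# With 2*N*s = 0.1 << 1, the mean stays near q0 and outcomes scatter widely,
# i.e. the deleterious allele behaves almost like a neutral one.
print(f"mean final frequency: {reps.mean():.3f} (initial 0.05)")
print(f"allele lost in {(reps == 0).mean():.0%} of replicates")
```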
Abstract:
Expanding populations incur a mutation burden – the so-called expansion load. Previous studies of expansion load have focused on codominant mutations. An important consequence of this assumption is that expansion load stems exclusively from the accumulation of new mutations occurring in individuals living at the wave front. Using individual-based simulations, we study here the dynamics of standing genetic variation at the front of expansions, and its consequences for mean fitness if mutations are recessive. We find that deleterious genetic diversity is quickly lost at the front of the expansion, but the loss of deleterious mutations at some loci is compensated by an increase of their frequencies at other loci. The frequency of deleterious homozygotes therefore increases along the expansion axis, whereas the average number of deleterious mutations per individual remains nearly constant across the species range. This reveals two important differences from codominant models: (i) mean fitness at the front of the expansion drops much faster if mutations are recessive, and (ii) mutation load can increase during the expansion even if the total number of deleterious mutations per individual remains constant. We use our model to make predictions about the shape of the site frequency spectrum at the front of a range expansion, and about correlations between heterozygosity and fitness in different parts of the species range. Importantly, these predictions provide opportunities to empirically validate our theoretical results. We discuss our findings in the light of recent results on the distribution of deleterious genetic variation across human populations and link them to empirical results on the correlation of heterozygosity and fitness found in many natural range expansions.
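The arithmetic behind point (ii) can be shown in a few lines (my illustration under Hardy–Weinberg assumptions, not the paper's simulation): the mean number of deleterious alleles per individual depends only on the summed allele frequencies, 2·Σq, while the recessive load tracks Σq², so packing the same total frequency into fewer, higher-frequency loci (as drift at the front does) raises the homozygous load.

```python
import numpy as np

core = np.full(100, 0.05)   # 100 loci at frequency 0.05 in the range core
front = np.zeros(100)
front[:25] = 0.20           # same summed frequency, concentrated at the front

for name, q in (("core", core), ("front", front)):
    alleles = 2 * q.sum()   # mean deleterious alleles per individual: equal
    homo = (q ** 2).sum()   # expected deleterious homozygous loci: not equal
    print(f"{name}: {alleles:.0f} alleles/ind, {homo:.2f} homozygous loci/ind")
```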
Abstract:
BACKGROUND Few data on the virological determinants of hepatitis B virus (HBV) infection are available from southern Africa. METHODS We enrolled consecutive HIV-infected adult patients initiating antiretroviral therapy (ART) at two urban clinics in Zambia and four rural clinics in Northern Mozambique between May 2013 and August 2014. HBsAg screening was performed using the Determine® rapid test. Quantitative real-time PCR and HBV sequencing were performed in HBsAg-positive patients. Risk factors for HBV infection were evaluated using Chi-square and Mann-Whitney tests, and associations between baseline characteristics and high-level HBV replication were explored in multivariable logistic regression. RESULTS Seventy-eight of 1,032 participants in Mozambique (7.6%, 95% confidence interval [CI]: 6.1-9.3) and 90 of 797 in Zambia (11.3%, 95% CI: 9.3-13.4) were HBsAg-positive. HBsAg-positive individuals were less likely than HBsAg-negative ones to be female (52.3% vs. 66.1%, p<0.001). Among 156 (92.9%) HBsAg-positive patients with an available measurement, median HBV viral load was 13,645 IU/mL (interquartile range: 192-8,617,488 IU/mL) and 77 (49.4%) had high values (>20,000 IU/mL). HBsAg-positive individuals had higher levels of ALT and AST compared to HBsAg-negative ones (both p<0.001). In multivariable analyses, male sex (adjusted odds ratio: 2.59, 95% CI: 1.22-5.53) and CD4 cell count below 200/μl (2.58, 1.20-5.54) were associated with high HBV DNA. HBV genotypes A1 (58.8%) and E (38.2%) were most prevalent. Four patients had probable resistance to lamivudine and/or entecavir. CONCLUSION One half of HBsAg-positive patients demonstrated high HBV viremia, supporting the early initiation of tenofovir-containing ART in HIV/HBV-coinfected adults.
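As a quick plausibility check of the reported prevalence intervals, assuming a Wilson score interval (the abstract does not state which method was used):

```python
# Recomputing the HBsAg prevalence CIs from the reported counts.
from statsmodels.stats.proportion import proportion_confint

for site, pos, n in (("Mozambique", 78, 1032), ("Zambia", 90, 797)):
    lo, hi = proportion_confint(pos, n, alpha=0.05, method="wilson")
    print(f"{site}: {pos / n:.1%} HBsAg-positive (95% CI {lo:.1%}-{hi:.1%})")
```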
Abstract:
Several studies have examined the association between high glycemic index (GI) and glycemic load (GL) diets and the risk for coronary heart disease (CHD). However, most of these studies were conducted primarily on white populations. The primary aim of this study was to examine whether high GI and GL diets are associated with increased risk for developing CHD in whites and African Americans, non-diabetics and diabetics, and within stratifications of body mass index (BMI) and hypertension (HTN). Baseline and 17-year follow-up data from the ARIC (Atherosclerosis Risk in Communities) study were used. The study population (n = 13,051) consisted of 74% whites, 26% African Americans, 89% non-diabetics, 11% diabetics, and 43% males and 57% females aged 44 to 66 years at baseline. Data from the ARIC food frequency questionnaire at baseline were analyzed to provide GI and GL indices for each subject. Increases of 25 and 30 units for GI and GL, respectively, were used to describe relationships with incident CHD risk. Hazard ratios adjusted for propensity score, with 95% confidence intervals (CI), were used to assess associations. During 17 years of follow-up (1987 to 2004), 1,683 cases of CHD were recorded. Glycemic index was associated with a 2.12-fold (95% CI: 1.05, 4.30) increase in incident CHD risk for all African Americans, and GL was associated with a 1.14-fold (95% CI: 1.04, 1.25) increase in CHD risk for all whites. In addition, GL was also an important CHD risk factor for white non-diabetics (HR=1.59; 95% CI: 1.33, 1.90). Furthermore, within the stratum of BMI 23.0 to 29.9 in non-diabetics, GI was associated with an increased hazard ratio of 11.99 (95% CI: 2.31, 62.18) for CHD in African Americans, and GL was associated with a 1.23-fold (95% CI: 1.08, 1.39) increase in CHD risk in whites. Body mass index modified the effect of GI and GL on CHD risk in all whites and white non-diabetics. For HTN, both systolic blood pressure and diastolic blood pressure modified the effect of GI and GL on CHD risk in all whites and African Americans, white and African American non-diabetics, and white diabetics. Further studies should examine other factors that could influence the effects of GI and GL on CHD risk, including dietary factors, physical activity, and diet-gene interactions.
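Hazard ratios "per 25 units of GI" come from rescaling the coefficient of a proportional hazards model. A hedged sketch with lifelines and invented data (the column names and values are not ARIC variables) shows the mechanics:

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(3)
n = 2000
df = pd.DataFrame({
    "gi": rng.normal(58, 8, n),          # glycemic index (invented values)
    "years": rng.uniform(1.0, 17.0, n),  # follow-up time
    "chd": rng.integers(0, 2, n),        # event indicator (invented)
})

cph = CoxPHFitter()
cph.fit(df, duration_col="years", event_col="chd")
beta = cph.params_["gi"]                 # log-hazard per 1 GI unit
print(f"HR per 25-unit GI increase: {np.exp(25 * beta):.2f}")
```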
Abstract:
Background. Cypress Creek is one of the main tributaries of Lake Houston, which provides drinking water to 21.4 million customers. Furthermore, the watershed is used for contact and non-contact recreation, such as canoeing, swimming, hiking, and picnicking. Water along the creek is impacted by numerous wastewater outfalls from both point and non-point sources. As the creek flows into Lake Houston, it carries both organic and inorganic contaminants that may affect the drinking water quality of this important water source reservoir. Objective. This study was carried out to evaluate the inorganic chemical load of the water in Cypress Creek along its entire length, from the headwaters in Waller County to the drainage into Lake Houston. The purpose was to determine whether there are hazardous concentrations of metals in the water and what the likely sources would be. Method. Samples were collected at 29 sites along the creek and analyzed for 29 metals, 17 of which were on the Environmental Protection Agency priority pollutant list. Public access sites, primarily at bridges, were used for sample collection. Samples were transported on ice to the University of Texas School of Public Health laboratory, spiked with 2 ml HNO3, kept overnight in the refrigerator, and transported the following day to the EPA laboratory for analysis. Analysis was done by EPA Method 200.7-ICP, Method 200.8-ICP/MS, and Method 245.1-CVAAS. Results. Metals were present above the detection limits at 65% of sites. Concentrations of aluminum, iron, sodium, potassium, magnesium, and calcium were particularly high at all sites. Aluminum, sodium, and iron concentrations greatly exceeded the EPA secondary drinking water standards at all sites. Conclusion. The recreational water along Cypress Creek is impacted by wastewater from both permitted and non-permitted outfalls, which deposit inorganic substances into the water. Although a number of inorganic contaminants were present in the water, toxic metals regulated by the EPA were mostly below the recommended limits. However, the high concentrations of aluminum, sodium, and iron in Cypress Creek raise the issue of unauthorized discharges of salt water from mining, as well as industrial and domestic wastewater.
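A small sketch of the screening comparison the study describes. The sample concentrations are invented; the aluminum and iron limits are the EPA secondary drinking water standards as I understand them, and the sodium figure is the EPA taste-based advisory; verify current values before relying on them.

```python
# Thresholds (mg/L): Al and Fe are EPA secondary standards; the sodium value
# is the EPA taste-based advisory. Confirm current figures before use.
thresholds_mg_l = {"aluminum": 0.2, "iron": 0.3, "sodium": 20.0}
sample_mg_l = {"aluminum": 1.4, "iron": 2.2, "sodium": 85.0}  # hypothetical site

for metal, limit in thresholds_mg_l.items():
    conc = sample_mg_l[metal]
    status = "EXCEEDS" if conc > limit else "ok"
    print(f"{metal}: {conc} mg/L vs {limit} mg/L -> {status}")
```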
Abstract:
Apple scab, caused by Venturia inaequalis, is a major disease affecting apple production. Breeding programs have released over 30 scab-resistant cultivars since 1970, with recent releases having much improved quality. Redfree and GoldRush came from a cooperative breeding program involving Purdue, Rutgers, and Illinois universities, while Liberty was introduced from the Cornell University breeding program. For these cultivars to gain better acceptance, more information is needed on their cropping capacities and on the effect of crop load on fruit quality attributes. Our study was conducted to determine the effect of increasing crop load on tree growth, fruit size, and fruit quality variables of the three cultivars under Iowa conditions.