257 results for Load levels
Abstract:
BACKGROUND: International comparisons of social inequalities in alcohol use have not been extensively investigated. The purpose of this study was to examine the relationship of country-level characteristics and individual socio-economic status (SES) to individual alcohol consumption in 33 countries. METHODS: Data on 101,525 men and women collected by cross-sectional surveys in the 33 countries of the GENACIS study were used. Individual SES was measured by highest attained educational level. Alcohol use measures included drinking status and monthly risky single occasion drinking (RSOD). The relationship between individuals' education and the drinking indicators was first examined by meta-analysis. In a second step, the individual-level and country-level data were combined and tested in multilevel models. As country-level indicators we used the purchasing power parity of the gross national income, the Gini coefficient, and the Gender Gap Index. RESULTS: In all countries and for both genders, higher individual SES was positively associated with drinking status; higher country-level SES was likewise associated with a higher proportion of drinkers. Among men, lower SES was associated with RSOD. Women of higher SES in low-income countries engaged in RSOD more often than women of lower SES, whereas the opposite was true in higher-income countries. CONCLUSION: For the most part, findings regarding SES and drinking in higher-income countries were as expected. However, women of higher SES in low- and middle-income countries appear to be at higher risk of engaging in RSOD. This finding should be kept in mind when developing new policy and prevention initiatives.
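For concreteness, the two-step design described in the METHODS culminates in a model of roughly the following form: a random-intercept logistic regression with individuals nested within countries. The notation is illustrative, not taken from the paper:

$$\operatorname{logit}\,\Pr(\text{drinker}_{ij}=1) = \beta_0 + \beta_1\,\text{education}_{ij} + \beta_2\,\text{GNI-PPP}_j + \beta_3\,\text{Gini}_j + \beta_4\,\text{GGI}_j + u_j, \qquad u_j \sim \mathcal{N}(0,\sigma_u^2)$$

where $i$ indexes individuals, $j$ indexes countries, and the random intercept $u_j$ absorbs between-country variation not captured by the three country-level indicators.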
Abstract:
There is great interindividual variability in the HIV-1 viral setpoint after seroconversion, some of which is known to be due to genetic differences among infected individuals. Here, our focus is on determining, genome-wide, the contribution of variable gene expression to viral control, and on relating it to genomic DNA polymorphism. RNA was extracted from purified CD4+ T-cells from 137 HIV-1 seroconverters, 16 elite controllers, and 3 healthy blood donors. Expression levels of more than 48,000 mRNA transcripts were assessed with the Human-6 v3 Expression BeadChip (Illumina). Genome-wide SNP data were generated from genomic DNA using the HumanHap550 Genotyping BeadChip (Illumina). We observed two distinct profiles, with 260 genes differentially expressed depending on HIV-1 viral load. There was significant upregulation of interferon-stimulated genes with increasing viral load, including genes of the intrinsic antiretroviral defense. Upon successful antiretroviral treatment, the transcriptome profile of previously viremic individuals reverted to a pattern comparable to that of elite controllers and of uninfected individuals. Genome-wide evaluation of cis-acting SNPs identified genetic variants modulating the expression of 190 genes. These were compared with the genes whose expression was associated with viral load: expression of one interferon-stimulated gene, OAS1, was found to be regulated by a SNP (rs3177979, p = 4.9E-12); however, we could not detect an independent association of the SNP with viral setpoint. Thus, this study represents an attempt to integrate genome-wide SNP signals with genome-wide expression profiles in the search for biological correlates of HIV-1 control. It underscores the paradox of the association between increasing levels of viral load and greater expression of antiviral defense pathways. It also shows that elite controllers do not have a fully distinctive mRNA expression pattern in CD4+ T cells. Overall, changes in global RNA expression reflect responses to viral replication rather than a mechanism that might explain viral control.
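The cis-eQTL step amounts to, per transcript-SNP pair, an additive-model regression of expression on genotype dosage. Below is a minimal sketch with synthetic stand-in data; the real analysis covers ~48,000 transcripts with covariate adjustment and genome-wide multiple-testing correction:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 156  # 137 seroconverters + 16 elite controllers + 3 donors

# Synthetic stand-ins: genotype dosage (0/1/2) at a cis SNP such as
# rs3177979, and normalized expression of the nearby gene (e.g. OAS1).
genotype = rng.integers(0, 3, n).astype(float)
expression = 0.5 * genotype + rng.normal(0.0, 1.0, n)

# Additive-model association test: expression ~ intercept + genotype dosage.
fit = sm.OLS(expression, sm.add_constant(genotype)).fit()
print(f"beta = {fit.params[1]:.3f}, p = {fit.pvalues[1]:.2e}")
```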
Abstract:
No earlier study has investigated the microbiology of negative pressure wound therapy (NPWT) foam in a standardized manner. The purpose of this study was to investigate the bacterial load and microbiological dynamics in NPWT foam removed from chronic wounds (>3 months). To determine the bacterial load, a standardized size of the removed NPWT foam was sonicated. The resulting sonication fluid was cultured, and the colony-forming units (CFU) of each species were enumerated. Sixty-eight foams from 17 patients (mean age 63 years, 71% males) were investigated. In 65 (97%) foams, ≥1 bacterial type was found, and in 37 (54%), ≥2 types. The bacterial load remained high during NPWT treatment, ranging from 10(4) to 10(6) CFU/ml. In three patients (27%), an additional type of bacteria was found in subsequent foam cultures. The mean bacterial count ± standard deviation was higher in polyvinyl alcohol foam (6.1 ± 0.5 CFU/ml) than in polyurethane foam (5.5 ± 0.8 CFU/ml) (p = 0.02). The mean log of the sum of CFU/ml was lower in foam removed under 125 mmHg pressure (5.5 ± 0.8) than under 100 mmHg (5.9 ± 0.5) (p = 0.01). In conclusion, the bacterial load in NPWT foam remains high, and routine foam changes do not reduce it.
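For orientation, the standard arithmetic behind such counts: the CFU/ml of the sonication fluid is back-calculated from the colonies counted on a plate, the plated volume, and the dilution factor, and then log10-transformed for comparisons of the kind reported above. The function below is a generic sketch of that arithmetic, not the study's protocol:

```python
import math

def cfu_per_ml(colonies: int, plated_ml: float, dilution: float) -> float:
    """Back-calculate CFU/ml of the undiluted fluid from one plate count."""
    return colonies / plated_ml / dilution

# Example: 58 colonies grown from 0.1 ml of a 10^-4 dilution.
cfu = cfu_per_ml(58, plated_ml=0.1, dilution=1e-4)
print(f"{cfu:.2e} CFU/ml, log10 = {math.log10(cfu):.1f}")
```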
Abstract:
The pancreatic beta cell presents functional abnormalities in the early stages of development of non-insulin-dependent diabetes mellitus (NIDDM). The disappearance of the first phase of insulin secretion induced by a glucose load is an early marker of NIDDM. This abnormality could be secondary to low expression of the pancreatic glucose transporter GLUT2. Together with the enzyme glucokinase, GLUT2 is responsible for proper beta cell sensing of extracellular glucose levels. In NIDDM, GLUT2 mRNA levels are low, suggesting a transcriptional defect of the GLUT2 gene. The first phase of glucose-induced insulin secretion by the pancreatic beta cell can be partly restored by the administration of a peptide discovered through a molecular approach, glucagon-like peptide 1 (GLP-1). The glucagon gene is expressed in a cell-specific manner in the A cells of the pancreatic islet and the L cells of the intestinal tract. The maturation of the propeptide encoded by the glucagon gene differs between the two cell types: glucagon is the main hormone produced by the A cells, whereas GLP-1 is the major peptide synthesized by the L cells of the intestine. GLP-1 is an incretin hormone and is at present the most potent insulinotropic peptide. The first results of administering GLP-1 to normal volunteers and diabetic patients are promising and suggest a new therapeutic approach to treating diabetic patients.
Abstract:
Urea nitrogen, creatinine, and uric acid are relatively stable in postmortem serum and may therefore be used for diagnostic purposes when chronic kidney disease and end-stage renal failure are investigated as causes of death. Nevertheless, uncertainties remain in defining the best alternative to postmortem serum for identifying and assessing significantly decreased kidney function. In this study, we investigated urea nitrogen, creatinine, and uric acid levels in postmortem serum, pericardial fluid, and vitreous humor in a series of medico-legal cases (500 autopsies) with various causes of death. No postmortem interval-related differences were observed in any of the investigated fluids for any analyzed parameter, confirming the biochemical stability of all compounds after death. Data analysis failed to reveal statistically significant differences between postmortem serum and pericardial fluid urea nitrogen, creatinine, and uric acid concentrations. Conversely, statistically significant differences were observed for all analyzed biomarkers between postmortem serum and vitreous humor, with lower concentrations of all markers measured in vitreous humor. The results of this study suggest that, in order to estimate blood analyte concentrations at the time of death as accurately as possible, pericardial fluid should be preferred over vitreous humor.
Abstract:
Neuropeptide Y (NPY) is present in the adrenal medulla and in sympathetic neurons as well as in the circulation. This peptide not only exerts a direct vasoconstrictor effect, but also potentiates the vasoconstriction evoked by norepinephrine and sympathetic nerve stimulation. The vasoconstrictor effect of norepinephrine is also enhanced by salt loading and reduced by salt depletion. The purpose of this study was therefore to assess whether a relationship exists between dietary sodium intake and circulating NPY levels. Uninephrectomized normotensive rats were maintained for 3 weeks on a low, regular, or high sodium intake. On the day of the experiment, plasma levels of NPY and catecholamines were measured in the unanesthetized animals. There was no significant difference in plasma norepinephrine and epinephrine levels between the 3 groups of rats. Plasma NPY levels were lowest (65.4 ± 8.8 fmol/ml, n = 10, mean ± SEM) in salt-restricted and highest (151.2 ± 25 fmol/ml, n = 14, p < 0.02) in salt-loaded animals. Intermediate values were obtained in rats kept on a regular sodium intake (117.6 ± 20.1 fmol/ml). These findings are therefore compatible with the hypothesis that sodium balance might, to some extent, influence blood pressure regulation via changes in circulating NPY levels, which in turn modify blood pressure responsiveness.
Abstract:
Six patients, five of whom had normal and one impaired renal function, all suffering from purulent arthritis caused by cephalosporin-sensitive germs, were given a seven-day course of 8 g cephacetrile daily. On the first day, 6 g were administered by continuous intravenous infusion at a rate of 500 mg/h, followed by 2 g over a further 45 min. On days 2 to 7, the patients received 2 short infusions of 4 g each at an interval of 12 h. In four patients with normal renal function, the serum half-life ranged from 0.8 to 1.4 h, serum levels during continuous infusion from 19 to 31 microgram/ml, and total clearances from 265 to 434 ml/min. In one patient, these values were 1.6 h, 70 microgram/ml and 131 ml/min, respectively (small volume of distribution). The concentrations in the synovial fluid varied from 2 to 29 microgram/ml; they were generally lower than the serum levels, but clearly exceeded the minimum inhibitory concentrations for germs commonly present in purulent arthritis. In five patients, the synovial fluid became germ-free and the arthritis was clinically cured. In the case presenting with renal insufficiency, the serum half-life was 5.8 h. During continuous administration, a steady state was not attained; peak serum levels amounted to 75 microgram/ml and the total clearance to 61 ml/min. The cephacetrile concentrations in the synovial fluid were very high (26 and 67 microgram/ml). In this case, in which the renal insufficiency associated with mycosis fungoides was present before treatment, renal function deteriorated further during treatment while the arthritis improved.
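The serum levels measured during the constant-rate infusion can be cross-checked against the standard steady-state relation for an intravenous infusion (a textbook identity, not a calculation reported in the paper):

$$C_{ss} = \frac{R_0}{CL}$$

With $R_0 = 500$ mg/h $\approx 8333$ µg/min, total clearances of 265 to 434 ml/min give $C_{ss} \approx 19$ to $31$ µg/ml, matching the reported range; the patient with a clearance of 131 ml/min correspondingly reached about 64 µg/ml, close to the 70 µg/ml observed, while in the renally insufficient patient (61 ml/min) the predicted plateau of roughly 137 µg/ml was never reached because steady state was not attained.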
Abstract:
Maintenance by the kidney of stable plasma K(+) values is crucial, as plasma K(+) controls muscle and nerve activity. Since renal K(+) excretion is regulated by the circadian clock, we aimed to identify the ion transporters involved in this process. In control mice, the renal mRNA expression of H,K-ATPase type 2 (HKA2) is 25% higher during rest compared to the activity period. Conversely, under dietary K(+) restriction, HKA2 expression is ∼40% higher during the activity period. This reversal suggests that HKA2 contributes to the circadian regulation of K(+) homeostasis. Compared to their wild-type (WT) littermates, HKA2-null mice fed a normal diet have 2-fold higher K(+) renal excretion during rest. Under K(+) restriction, their urinary K(+) loss is 40% higher during the activity period. This inability to excrete K(+) "on time" is reflected in plasma K(+) values, which vary by 12% between activity and rest periods in HKA2-null mice but remain stable in WT mice. Analysis of the circadian expression of HKA2 regulators suggests that Nrf2, but not progesterone, contributes to its rhythmicity. Therefore, HKA2 acts to maintain the circadian rhythm of urinary K(+) excretion and preserve stable plasma K(+) values throughout the day.
Abstract:
Combining measurements of monoamine metabolites in the cerebrospinal fluid (CSF) with neuroimaging can increase the efficiency of drug discovery for the treatment of brain disorders. To explore this approach, we examined five drug-naïve patients suffering from schizophrenia. Patients were assessed clinically using the Positive and Negative Syndrome Scale (PANSS) at baseline and then at weekly intervals. Plasma and CSF levels of quetiapine and norquetiapine, as well as CSF 3,4-dihydroxyphenylacetic acid (DOPAC), homovanillic acid (HVA), 5-hydroxyindole-acetic acid (5-HIAA) and 3-methoxy-4-hydroxyphenylglycol (MHPG), were obtained at baseline and again after at least a 4-week medication trial with 600 mg/day quetiapine. CSF monoamine metabolite levels were compared with dopamine D(2) receptor occupancy (DA-D(2)) determined using [(18)F]fallypride and positron emission tomography (PET). Quetiapine produced preferential occupancy of DA-D(2) receptors in the parietal cortex versus the putamen, 41.4% (p<0.05, corrected for multiple comparisons). DA-D(2) receptor occupancies in the occipital and parietal cortex were correlated with CSF quetiapine and norquetiapine levels (p<0.01 and p<0.05, respectively). CSF monoamine metabolites were significantly increased after treatment and correlated with regional receptor occupancies in the putamen [DOPAC: (p<0.01) and HVA: (p<0.05)], caudate nucleus [HVA: (p<0.01)], thalamus [MHPG: (p<0.05)] and temporal cortex [HVA: (p<0.05) and 5-HIAA: (p<0.05)]. This suggests that CSF monoamine metabolite levels reflect the effects of quetiapine treatment on neurotransmitters in vivo, and indicates that monitoring plasma and CSF quetiapine and norquetiapine levels may be of clinical relevance.
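Receptor occupancy in [(18)F]fallypride PET studies of this kind is conventionally derived from the nondisplaceable binding potential (BP(ND)) estimated regionally off and on drug; the formula below is the standard definition, included for clarity rather than quoted from the abstract:

$$\text{Occupancy} = \frac{BP_{ND}^{\text{baseline}} - BP_{ND}^{\text{treatment}}}{BP_{ND}^{\text{baseline}}} \times 100\%$$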
Abstract:
BACKGROUND: Iterative reconstruction (IR) techniques reduce image noise in multidetector computed tomography (MDCT) imaging. They can therefore be used to reduce radiation dose while maintaining diagnostic image quality nearly constant. However, CT manufacturers offer several strength levels of IR to choose from. PURPOSE: To determine the optimal strength level of IR in low-dose MDCT of the cervical spine. MATERIAL AND METHODS: Thirty consecutive patients investigated by low-dose cervical spine MDCT were prospectively studied. Raw data were reconstructed using filtered back-projection and sinogram-affirmed IR (SAFIRE, strength levels 1 to 5) techniques. Image noise, signal-to-noise ratio (SNR), and contrast-to-noise ratio (CNR) were measured at C3-C4 and C6-C7 levels. Two radiologists independently and blindly evaluated various anatomical structures (both dense and soft tissues) using a 4-point scale. They also rated the overall diagnostic image quality using a 10-point scale. RESULTS: As IR strength levels increased, image noise decreased linearly, while SNR and CNR both increased linearly at C3-C4 and C6-C7 levels (P < 0.001). For the intervertebral discs, the content of neural foramina and dural sac, and for the ligaments, subjective image quality scores increased linearly with increasing IR strength level (P ≤ 0.03). Conversely, for the soft tissues and trabecular bone, the scores decreased linearly with increasing IR strength level (P < 0.001). Finally, the overall diagnostic image quality scores increased linearly with increasing IR strength level (P < 0.001). CONCLUSION: The optimal strength level of IR in low-dose cervical spine MDCT depends on the anatomical structure to be analyzed. For the intervertebral discs and the content of neural foramina, high strength levels of IR are recommended.
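The objective image-quality metrics above have standard ROI-based definitions: with $\mu$ the mean attenuation of a region of interest and $\sigma_{\text{noise}}$ the standard deviation within a homogeneous reference region, the usual formulas (not quoted from the paper) are

$$SNR = \frac{\mu_{\text{ROI}}}{\sigma_{\text{noise}}}, \qquad CNR = \frac{|\mu_{\text{ROI}} - \mu_{\text{background}}|}{\sigma_{\text{noise}}}$$

so the reported linear decrease in noise with increasing IR strength mechanically drives the linear increases in SNR and CNR.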
Abstract:
Background: Although combination antiretroviral therapy (cART) dramatically reduces rates of AIDS and death, a minority of patients experience clinical disease progression during treatment. Objective: To investigate whether detection of CXCR4 (X4)-specific strains or quantification of the X4-specific HIV-1 load predicts clinical outcome. Methods: From the Swiss HIV Cohort Study, 96 participants who initiated cART yet subsequently progressed to AIDS or death were compared with 84 contemporaneous, treated nonprogressors. A sensitive heteroduplex tracking assay was developed to quantify plasma X4 and CCR5 variants and resolve the HIV-1 load into coreceptor-specific components. Measurements were analyzed as cofactors of progression in multivariable Cox models adjusted for concurrent CD4 cell count and total viral load, applying inverse probability weights to adjust for sampling bias. Results: Patients with X4 variants at baseline displayed reduced CD4 cell responses compared with those without X4 strains (40 versus 82 cells/µl; P = 0.012). The adjusted multivariable hazard ratio (HR) for clinical progression was 4.8 [95% confidence interval (CI) 2.3-10.0] for those with X4 strains at baseline. The X4-specific HIV-1 load was a similarly independent predictor, with HR values of 3.7 (95% CI, 1.2-11.3) and 5.9 (95% CI, 2.2-15.0) for baseline loads of 2.2-4.3 and >4.3 log(10) copies/ml, respectively, compared with <2.2 log(10) copies/ml. Conclusions: HIV-1 coreceptor usage and X4-specific viral loads strongly predicted disease progression during cART, independently of and in addition to CD4 cell count and total viral load. Detection and quantification of X4 strains promise to be clinically useful biomarkers to guide patient management and study HIV-1 pathogenesis.
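A minimal lifelines sketch of the analysis framing described in the Methods: a multivariable Cox model with inverse probability weights. The file and all column names are hypothetical, standing in for the cohort dataset:

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical dataset: one row per patient with time to AIDS/death, an
# event indicator, baseline X4 detection, concurrent CD4 cell count, total
# viral load, and precomputed inverse probability sampling weights.
df = pd.read_csv("coreceptor_cohort.csv")  # assumed file and columns

cph = CoxPHFitter()
cph.fit(
    df[["time", "event", "x4_detected", "cd4", "log10_total_vl", "ipw"]],
    duration_col="time",
    event_col="event",
    weights_col="ipw",  # inverse probability weights for sampling bias
    robust=True,        # robust (sandwich) standard errors with weights
)
print(cph.summary[["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%"]])
```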
Abstract:
The role of peroxisome proliferator-activated receptor (PPAR)β/δ in the pathogenesis of Alzheimer's disease has only recently been explored through the use of PPARβ/δ agonists. Here we evaluated the effects of PPARβ/δ deficiency on the amyloidogenic pathway and tau hyperphosphorylation. PPARβ/δ-null mice showed cognitive impairment in the object recognition task, accompanied by enhanced DNA-binding activity of NF-κB in the cortex and increased expression of IL-6. In addition, two NF-κB target genes involved in β-amyloid (Aβ) synthesis and deposition, the β-site APP cleaving enzyme 1 (Bace1) and the receptor for advanced glycation endproducts (Rage), respectively, were increased in PPARβ/δ-null mice compared with wild-type animals. Protein levels of glial fibrillary acidic protein (GFAP) were increased in the cortex of PPARβ/δ-null mice, suggesting astrogliosis. Finally, tau hyperphosphorylation at Ser199 and enhanced levels of PHF-tau were associated with increased levels of the tau kinases CDK5 and phospho-ERK1/2 in the cortex of PPARβ/δ(-/-) mice. Collectively, our findings indicate that PPARβ/δ deficiency results in cognitive impairment associated with enhanced inflammation, astrogliosis and tau hyperphosphorylation in the cortex.
Abstract:
BACKGROUND: Estimates of the incidence of drug resistance to modern first-line combination antiretroviral therapies against human immunodeficiency virus (HIV) type 1 are complicated by the limited availability of genotypic drug resistance tests (GRTs) and the uncertain timing of resistance emergence. METHODS: Five first-line combinations were studied (all paired with lamivudine or emtricitabine): efavirenz (EFV) plus zidovudine (AZT) (n = 524); EFV plus tenofovir (TDF) (n = 615); lopinavir (LPV) plus AZT (n = 573); LPV plus TDF (n = 301); and ritonavir-boosted atazanavir (ATZ/r) plus TDF (n = 250). Virological treatment outcomes were classified into 3 risk strata for the emergence of resistance, based on whether undetectable HIV RNA levels were maintained during therapy and, if not, whether viral loads were >500 copies/mL during treatment. Probabilities for the presence of resistance mutations were estimated from GRTs (n = 2876) according to risk stratum and therapy received at the time of testing. On the basis of these data, events of resistance emergence were imputed for each individual and assessed using survival analysis. Imputation was repeated 100 times, and results were summarized by median values (2.5th-97.5th percentile range). RESULTS: Six years after treatment initiation, EFV plus AZT showed the highest cumulative resistance incidence (16%); all other regimens remained below 11%. Confounder-adjusted Cox regression confirmed that first-line EFV plus AZT (reference) was associated with a higher median hazard of resistance emergence than the other treatments: EFV plus TDF (hazard ratio [HR], 0.57; range, 0.42-0.76), LPV plus AZT (HR, 0.63; range, 0.45-0.89), LPV plus TDF (HR, 0.55; range, 0.33-0.83), and ATZ/r plus TDF (HR, 0.43; range, 0.17-0.83). Two-thirds of resistance events were associated with detectable HIV RNA levels ≤500 copies/mL during treatment, and only one-third with virological failure (HIV RNA level >500 copies/mL). CONCLUSIONS: The inclusion of TDF instead of AZT, and of ATZ/r, was correlated with lower rates of resistance emergence, most likely because of improved tolerability and pharmacokinetics resulting from a once-daily dosage.
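A compact sketch of the imputation-and-summary logic described in the METHODS, on synthetic data: events are drawn from assumed stratum-specific resistance probabilities, a Cox model is refitted per imputation, and hazard ratios are summarized by their median and 2.5th-97.5th percentiles. The regimen contrast, probabilities, and column names are all invented for illustration:

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 500

# Synthetic cohort: regimen indicator and follow-up time in years.
df = pd.DataFrame({
    "tdf_regimen": rng.integers(0, 2, n),  # 1 = TDF-containing first line
    "followup": rng.exponential(4.0, n).clip(0.1, 6.0),
})
# Assumed per-patient probabilities that resistance emerged, standing in
# for the GRT-derived, stratum-specific estimates of the paper.
p_resist = np.where(df["tdf_regimen"] == 1, 0.08, 0.16)

hrs = []
for _ in range(100):                              # 100 repeated imputations
    df["resistance"] = rng.binomial(1, p_resist)  # impute resistance events
    cph = CoxPHFitter().fit(df, duration_col="followup",
                            event_col="resistance")
    hrs.append(float(np.exp(cph.params_["tdf_regimen"])))

print(f"median HR {np.median(hrs):.2f} "
      f"({np.percentile(hrs, 2.5):.2f}-{np.percentile(hrs, 97.5):.2f})")
```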
Abstract:
How does the multi-sensory nature of stimuli influence information processing? Cognitive systems with limited selective attention, such as those of young children, can help elucidate these processes. Six-year-olds, 11-year-olds and 20-year-olds engaged in a visual search task that required them to detect a pre-defined coloured shape under conditions of low or high visual perceptual load. On each trial, a peripheral distractor that could be either compatible or incompatible with the current target colour was presented visually, auditorily or audiovisually. Unlike unimodal distractors, audiovisual distractors elicited reliable compatibility effects across both levels of load in adults and in the older children, but high visual load significantly reduced distraction for all children, especially the youngest participants. This study provides the first demonstration that multi-sensory distraction has powerful effects on selective attention: adults and older children alike allocate attention to potentially relevant information across multiple senses. However, poorer attentional resources can, paradoxically, shield the youngest children from the deleterious effects of multi-sensory distraction. Furthermore, we highlight how developmental research can enrich the understanding of the distinct mechanisms controlling adult selective attention in multi-sensory environments.