64 results for Atmospheric plasma treatments
Abstract:
Anti-human immunodeficiency virus type 1 (HIV-1) "binding antibodies" (antibodies capable of binding to synthetic peptides or proteins) occur throughout HIV-1 infection and are high-titered and highly cross-reactive, as confirmed in this study by analyzing plasma from individuals infected with B and F genotype HIV-1. Plasma from individuals infected with clade F HIV-1 displayed the most frequent cross-reactivity, at high titers, while Bbr plasma showed much higher specificity. Similarly, neutralization of a reference HIV-1 isolate (HIV-1 MN) was observed more frequently with plasma from F than from B genotype infected individuals. No significant difference was seen in the neutralization susceptibility of primary B, Bbr or F clade HIV-1 isolates to plasma from individuals infected with the classical B (GPGR) or F HIV-1, but Bbr (GWGR) plasma was less likely to neutralize F genotype primary HIV-1 isolates. The data indicate that both B and F genotype derived vaccines would be similarly effective against B and F HIV-1 infection, with effectiveness slightly more probable for the F than the B genotype. Although the Bbr variant appears to induce a much more specific humoral immune response, the neutralization susceptibility of the Brazilian HIV-1 Bbr variant is similar to that observed for the classical B genotype HIV-1.
Abstract:
In spite of its widespread use, benznidazole's (BNZ) toxicity and low efficacy remain major drawbacks that impair successful treatment of Chagas disease. Previously, in an attempt to increase selectivity for infected tissues and reduce toxicity, multilamellar liposomes (MLV) composed of hydrogenated soybean phosphatidylcholine (HSPC): distearoyl-phosphatidylglycerol (DSPG): cholesterol (CHOL) 2:1:2 mol:mol loaded with BNZ (MLV-BNZ) were designed. In this work we compared different properties of MLV-BNZ with those of free BNZ. In contrast to other hydrophobic drugs, the results indicated that slight changes in BNZ's degree of association to proteins and lipoproteins should not modify the percentage of unbound drug available to exert pharmacological action. On the other hand, when loaded in MLV, BNZ reduced its association to plasma proteins by 45% and became refractory to the sinking effect of blood, dropping 4.5-fold. Additionally, when loaded in MLV, BNZ had a higher volume of distribution (160 ± 20 vs 102 ± 15 ml/kg) and total clearance (35.23 ± 2.3 vs 21.9 ± 1.4 ml/h.kg), and a lower area under the concentration-time curve (7.23 ± 0.2 vs 9.16 ± 0.5 µg.h/ml) than free BNZ. Hence, these studies showed that for MLV-BNZ the amount of BNZ can be substantially increased, from 25 to 70%, with this formulation being more rapidly cleared from circulation than the free drug; moreover, owing to the lower interaction with blood components, fewer side effects can be expected.
Abstract:
This study aims to investigate the importance of the serum factors present in the plasma of resistant Biomphalaria tenagophila snails when transferred to susceptible conspecifics. Susceptible B. tenagophila (CF) received plasma from resistant B. tenagophila (Taim), and both groups were later infected with Schistosoma mansoni. We observed that the plasma transfer increased the resistance of the susceptible snails by about 86% compared to the non-immunized group (p < 0.001).
Abstract:
Malaria remains an important health problem in tropical countries such as Brazil. Thrombocytopenia is the most common hematological disturbance seen in malarial infection, and oxidative stress (OS) has been implicated as a possible mediator of thrombocytopenia in patients with malaria. This study aimed to investigate the role of OS in the thrombocytopenia of Plasmodium vivax malaria through the measurement of oxidant and antioxidant biochemical markers in plasma and in isolated platelets. Eighty-six patients with P. vivax malaria were enrolled. Blood samples were analyzed for total antioxidant and oxidant status, albumin, total protein, uric acid, zinc, magnesium, bilirubin, total thiols, glutathione peroxidase (GPx), malondialdehyde (MDA), antibodies against mildly oxidized low-density lipoproteins (LDL-/nLDL ratio) and nitrite/nitrate levels in blood plasma, and for GPx and MDA in isolated platelets. Plasma MDA levels were higher in thrombocytopenic (TCP) patients (median 3.47; range 1.55-12.90 µmol/L) than in non-thrombocytopenic (NTCP) patients (median 2.57; range 1.95-8.60 µmol/L). Moreover, the LDL-/nLDL autoantibody ratio was lower in TCP (median 3.0; range 1.5-14.8) than in NTCP patients (median 4.0; range 1.9-35.5). Finally, GPx and MDA were higher in the platelets of TCP patients. These results suggest that oxidative damage of platelets, as indicated by the alterations in GPx and MDA, might be important in the pathogenesis of the thrombocytopenia found in P. vivax malaria.
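The group comparisons above rest on median-and-range summaries. As a minimal sketch of how such summaries are computed (the per-patient values below are hypothetical, since the abstract reports only group statistics):

```python
from statistics import median

def summary(values):
    """Median, minimum and maximum, matching the 'median; range' format used above."""
    return median(values), min(values), max(values)

# Hypothetical plasma MDA values (µmol/L), for illustration only; the study's
# per-patient data are not reported in the abstract.
tcp = [1.55, 2.90, 3.47, 5.10, 12.90]    # thrombocytopenic patients
ntcp = [1.95, 2.30, 2.57, 3.10, 8.60]    # non-thrombocytopenic patients

med, lo, hi = summary(tcp)
print(f"TCP MDA: median {med}; range {lo}-{hi}")
```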
Abstract:
Clinical trials comparing different drug regimens and strategies for the treatment of congenital toxoplasmosis and its clinical manifestations in the liveborn child in different clinical settings should aim at formally evaluating the net benefit of existing treatments and at developing new therapeutic options. Currently, there is no ideal drug for congenital toxoplasmosis; future research should focus on the screening of new active drugs and on their pre-clinical and early clinical development, with a focus on pharmacokinetic/pharmacodynamic studies and teratogenicity. For the prenatal treatment of congenital toxoplasmosis, a trial comparing spiramycin to pyrimethamine-sulphadiazine and placebo would allow a formal estimation of the effect of both drugs in infected pregnant women. In newborn children, the net benefit of pyrimethamine-sulphadiazine should also be formally assessed. These trials will be implemented in settings where prenatal screening for Toxoplasma gondii is currently in place. Trials should be carefully designed to allow for translation to other settings, and modelling tools such as cost-effectiveness analysis should be used to provide clinicians and funders with the best available evidence to establish recommendations.
Abstract:
Chemokines recruit and activate leukocytes, assisting granuloma formation. Herein, we evaluated plasma chemokines in patients with active tuberculosis (ATB) and after completing treatment (TTB) and compared them to BCG-vaccinated healthy controls (HC). Levels of chemokines were measured by cytometric bead array. Levels of CXCL8, CXCL9 and CXCL10 were higher in ATB patients compared to HC, but they decreased in TTB. Levels of CCL2 and CCL5 in ATB patients were similar to those observed in HC. Thus, the high levels of CXC-chemokines detected during ATB, which can modulate the trafficking of immune cells from the periphery to the site of infection, were reversed by anti-mycobacterial treatment.
Abstract:
The goal of this study was to evaluate changes in plasma human immunodeficiency virus (HIV) RNA concentration [viral load (VL)] and CD4+ percentage (CD4%) during 6-12 weeks postpartum (PP) among HIV-infected women and to assess differences according to the reason for receipt of antiretrovirals (ARVs) during pregnancy [prophylaxis (PR) vs. treatment (TR)]. Data from a prospective cohort of HIV-infected pregnant women (National Institute of Child Health and Human Development International Site Development Initiative Perinatal Study) were analyzed. Women experiencing their first pregnancy who received ARVs for PR (started during pregnancy, stopped PP) or for TR (initiated prior to pregnancy and/or continued PP) were included and were followed PP. Increases in plasma VL (> 0.5 log10) and decreases in CD4% (> 20% relative decrease) between hospital discharge (HD) and PP were assessed. Of the 1,229 women enrolled, 1,119 met the inclusion criteria (PR: 601; TR: 518). At enrollment, 87% were asymptomatic. The median CD4% values were 34% (PR) and 25% (TR) at HD, and 29% (PR) and 24% (TR) PP. VL increases occurred in 60% (PR) and 19% (TR) of women (p < 0.0001); CD4% decreases occurred in 36% (PR) and 18% (TR) (p < 0.0001). Women receiving PR were more likely to exhibit an increase in VL [adjusted odds ratio (AOR) 7.7; 95% CI: 5.5-10.9] and a CD4% decrease (AOR 2.3; 95% CI: 1.6-3.2). Women receiving PR are more likely to have VL increases and CD4% decreases than those receiving TR. The clinical implications of these VL and CD4% changes remain to be explored.
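The outcome definitions and effect measures above are simple arithmetic. The sketch below checks a VL change against the study's > 0.5 log10 criterion and computes a crude (unadjusted) odds ratio from counts reconstructed from the reported percentages; note this crude value differs from the covariate-adjusted AOR of 7.7 reported in the abstract.

```python
import math

def log10_vl_change(vl_discharge: float, vl_postpartum: float) -> float:
    """Log10 change in viral load between hospital discharge and postpartum."""
    return math.log10(vl_postpartum) - math.log10(vl_discharge)

def crude_odds_ratio(a: int, b: int, c: int, d: int) -> float:
    """Unadjusted odds ratio for a 2x2 table [[a, b], [c, d]]."""
    return (a * d) / (b * c)

# A > 0.5 log10 rise counts as a VL increase under the study's definition.
assert log10_vl_change(1_000, 10_000) > 0.5        # a 1.0 log10 rise
assert not log10_vl_change(1_000, 2_000) > 0.5     # only a ~0.3 log10 rise

# Approximate counts reconstructed from the reported percentages
# (60% of 601 PR women vs 19% of 518 TR women had a VL increase).
a, b = 361, 240   # PR: increase / no increase
c, d = 98, 420    # TR: increase / no increase
print(round(crude_odds_ratio(a, b, c, d), 1))  # crude OR, not the adjusted 7.7
```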
Abstract:
It is not well established whether cytokine production differs in response to different clinical forms of visceral leishmaniasis (VL). In this work, we performed a cross-sectional study to investigate the plasma levels of cytokines [interferon (IFN)-γ, tumour necrosis factor (TNF)-α, interleukin (IL)-2, IL-4, IL-10 and IL-12] involved in the pathogenesis of VL in 80 subjects from VL endemic areas, including subjects with active VL, subjects with asymptomatic infection, subjects with cured VL and uninfected controls. The patients were recruited by sampling from a referral hospital and by random selection from a population-based cohort study. The results showed significant differences in the plasma concentrations of all cytokines between the groups (p < 0.05). Patients with active disease had higher plasma levels of IL-10, IL-4, IFN-γ and TNF-α relative to the other groups and produced more IL-12 than asymptomatic and cured subjects. Only the IL-2 concentration was higher in the asymptomatic and cured subjects relative to the patients with active disease (p < 0.05). Our results suggest that these cytokines can be used as markers in epidemiological studies conducted in endemic areas to distinguish between different clinical forms of VL. However, their usefulness should be confirmed in investigations conducted in other endemic areas.
Abstract:
This cross-sectional study aimed to analyze adherence to drug and non-drug treatments in 17 Family Health Strategy units. A total of 423 patients with type 2 diabetes mellitus were selected through stratified random sampling in Family Health Strategy units of a city in the state of Minas Gerais, Brazil, in 2010. The results showed that the prevalence of adherence to drug therapy was higher than 60% in all 17 units investigated; adherence to physical activity was higher than 60% in 58.8% of the units; and for the diet plan, there was no adherence in 52.9% of the units. We therefore concluded that adherence to drug therapy was high in most units, adherence to physical activity was heterogeneous, and adherence to diet was low in all units. We recommend strengthening institutional guidelines and educational strategies, in line with SUS guidelines, so that professionals may face the challenges imposed by the lack of adherence.
Abstract:
To increase the precision of soil fertility chemical analyses and to determine several elements simultaneously, some laboratories have been opting for inductively coupled plasma optical emission spectrophotometry (ICP) over atomic absorption spectrophotometry (AAS), currently the technique most commonly used in soil analysis laboratories. Besides comparing the two measurement techniques with respect to precision, reproducibility, and the magnitude of the contents of the micronutrients Fe, Zn, Cu and Mn extracted by Mehlich-1, Mehlich-3 and DTPA-TEA, this study also aimed to select the wavelengths with the least spectral interference in ICP. Thirty-six soil samples (0 to 0.2 m) collected in the states of Minas Gerais and Bahia, with wide variation in micronutrient contents, were used; three soils were selected to define the ICP wavelengths and to evaluate the precision and reproducibility of the measurement methods. The wavelengths with the least spectral interference in ICP were: 259.939 nm for Fe in Mehlich-1 and DTPA-TEA and 234.349 nm in Mehlich-3; 213.857 nm for Zn and 324.752 nm for Cu in all three extractants; and 259.372 nm for Mn in Mehlich-1 and DTPA-TEA and 260.568 nm in Mehlich-3. Both ICP and AAS were precise and reproducible for Fe and Mn determinations, while ICP, owing to its lower detection limit, was more precise and reproducible for Zn and Cu. The measurement methods differed statistically (p < 0.01) by the identity test applied for the determinations of Fe, Zn, Cu and Mn using Mehlich-1, Mehlich-3 and DTPA-TEA, thus compromising the interpretation of ICP results based on critical levels established from AAS.
Abstract:
Sustainable management of soils with low natural fertility on family farms in the humid tropics is a great challenge, and overcoming it would be of enormous benefit to the environment and the farmers. The objective of this study was to assess the environmental and agronomic benefits of alley cropping, based on the evaluation of C sequestration, soil quality indicators, and corn yields. Combinations of four legumes were used in alley cropping systems in the following treatments: Clitoria fairchildiana + Cajanus cajan; Acacia mangium + Cajanus cajan; Leucaena leucocephala + Cajanus cajan; Clitoria fairchildiana + Leucaena leucocephala; Leucaena leucocephala + Acacia mangium; and a control. Corn was used as the cash crop. The C content was determined in the different soil organic matter compartments, along with the CEC, available P, base saturation, percentage of water saturation, the period during which the root hospitality factor remained below the critical level, and corn yield. It was concluded that alley cropping could replace the slash-and-burn system in the humid tropics. The main environmental benefit of alley cropping is the maintenance of a dynamic equilibrium between C input and output that can sustain up to 10 Mg ha-1 of C in the litter layer, decreasing atmospheric CO2 levels. Alley cropping is also beneficial from the agricultural point of view, because it increases base saturation and decreases physical resistance to root penetration in the 0-10 cm soil layer, which ensures the increase and sustainability of corn yield.
Abstract:
Nitrogen fertilizers increase nitrous oxide (N2O) emissions and can reduce methane (CH4) oxidation in agricultural soils. However, the magnitude of this effect under Southern Brazilian edaphoclimatic conditions is unknown, as is the extent to which different mineral N fertilizer sources contribute to it. The aim of this study was to investigate the effects of different mineral N sources (urea, ammonium sulphate, calcium nitrate, ammonium nitrate, Uran, controlled-release N fertilizer, and urea with urease inhibitor) on N2O and CH4 fluxes from a Gleysol in Southern Brazil (Porto Alegre, RS), in comparison to a control treatment without N application. The experiment was arranged in randomized blocks with three replications, and the N fertilizers were applied to corn at the V5 growth stage. Air samples were collected from static chambers for 15 days after the N application, and the N2O and CH4 concentrations were determined by gas chromatography. The highest emissions occurred three days after the N fertilizer application and ranged from 187.8 to 8587.4 µg m-2 h-1 N. The greatest emissions were observed for nitric-N-based fertilizers, while N sources with a urease inhibitor and controlled-release N presented the smallest values, and ammonium-N and amidic sources were intermediate. This peak of N2O emissions was related to soil NO3--N (R² = 0.56, p < 0.08) when the soil water-filled pore space reached 70%, indicating that N2O was predominantly produced by denitrification in the soil. Soil CH4 fluxes ranged from -30.1 µg m-2 h-1 C (absorption) to +32.5 µg m-2 h-1 C (emission), and the accumulated emission over the period was related to the soil NH4+-N concentration (R² = 0.82, p < 0.001), probably due to enzymatic competition between nitrification and methanotrophy.
Although both gas fluxes were affected by N fertilizers, averaged over the treatments, the impact on CH4 emission (0.2 kg ha-1 equivalent CO2-C) was a hundredfold smaller than that on N2O (132.8 kg ha-1 equivalent CO2-C). Accounting for the N2O and CH4 emissions plus an energetic cost of N fertilizers of 1.3 kg CO2-C kg-1 N for manufacture, transport and application, we estimated an environmental impact of the N sources ranging from 220.4 to 664.5 kg ha-1 CO2-C, which can only be partially offset by C sequestration in the soil, as no study in Southern Brazil has reported an annual net soil C accumulation rate larger than 160 kg ha-1 C due to N fertilization. N2O mitigation can be obtained by replacing nitric-N sources with ammonium and amidic fertilizers. Controlled-release N fertilizers and urea with urease inhibitor are also potential alternatives for mitigating N2O emissions to the atmosphere, and systematic studies are necessary to quantify their potential in Brazilian agroecosystems.
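The environmental-impact accounting described above is straightforward arithmetic. The sketch below uses the mean per-hectare impacts reported in the abstract and an illustrative N rate; the reported 220.4-664.5 kg ha-1 range reflects per-source variation in N2O emission that is not reproduced here.

```python
# Mean greenhouse-gas impacts reported in the abstract, in kg CO2-C ha-1
# (N2O and CH4), plus the energetic cost of N fertilizer in kg CO2-C per kg N
# (manufacture, transport and application).
N2O_IMPACT = 132.8
CH4_IMPACT = 0.2
ENERGY_COST_PER_KG_N = 1.3

def total_impact(n_rate_kg_ha: float) -> float:
    """Total environmental impact (kg CO2-C ha-1) for a given N rate."""
    return N2O_IMPACT + CH4_IMPACT + ENERGY_COST_PER_KG_N * n_rate_kg_ha

# Illustrative N rate of 100 kg ha-1 (not the study's actual rate):
print(round(total_impact(100), 1))  # 263.0
```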
Abstract:
Soil organic matter (SOM) plays a crucial role in soil quality and can act as an atmospheric C-CO2 sink under conservationist management systems. This study aimed to evaluate the long-term (19-year) effects of tillage (CT, conventional tillage; NT, no-tillage) and crop rotations (R0, monoculture; R1, winter crop rotation; R2, intensive crop rotation) on the total, particulate and mineral-associated organic carbon (C) stocks of an originally degraded Red Oxisol in Cruz Alta, RS, Southern Brazil. The climate is humid subtropical Cfa 2a (Köppen classification), with a mean annual precipitation of 1,774 mm and a mean annual temperature of 19.2 ºC. The plots were divided into four segments, each sampled in the 0-0.05, 0.05-0.10, 0.10-0.20, and 0.20-0.30 m layers. Sampling was performed manually by opening small trenches. The SOM pools were determined by physical fractionation. Soil C stocks had a linear relationship with annual crop C inputs, regardless of the tillage system; thus, soil disturbance had a minor effect on SOM turnover. In the 0-0.30 m layer, soil C sequestration ranged from 0 to 0.51 Mg ha-1 yr-1, using the CT R0 treatment as the baseline; crop rotation systems had more influence on the soil C stock than tillage systems. The mean C sequestration rate of the cropping systems was 0.13 Mg ha-1 yr-1 higher under NT than CT. This result was associated with the higher C input by crops due to the improvement in soil quality under long-term no-tillage. The particulate C fraction was a sensitive indicator of soil management quality, while mineral-associated organic C was the main pool of atmospheric C fixed in this clayey Oxisol. C retention in this stable SOM fraction accounted for 81 and 89% of total C sequestration in the NT R1 and NT R2 treatments, respectively, relative to the same cropping systems under CT.
The highest C management index was observed in NT R2, confirming the capacity of this soil management practice to improve the soil C stock qualitatively in relation to CT R0. The results highlighted the diversification of crop rotation with cover crops as a crucial strategy for atmospheric C-CO2 sequestration and SOM quality improvement in highly weathered subtropical Oxisols.
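The sequestration rates reported above follow from a stock-difference calculation over the 19-year experiment: the C stock of a treatment minus the CT R0 baseline, divided by the duration. A minimal sketch with illustrative stock values (the paper's actual stocks are not given in the abstract):

```python
YEARS = 19  # duration of the experiment

def sequestration_rate(stock_treatment: float, stock_baseline: float) -> float:
    """Annual C sequestration rate (Mg ha-1 yr-1) relative to the baseline."""
    return (stock_treatment - stock_baseline) / YEARS

# e.g. a treatment holding 9.69 Mg ha-1 more C than the CT R0 baseline after
# 19 years corresponds to the 0.51 Mg ha-1 yr-1 upper rate reported above
# (stock values here are hypothetical).
rate = sequestration_rate(59.69, 50.0)
print(round(rate, 2))  # 0.51
```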
Abstract:
Knowledge of soil physical properties, including clay content, is of utmost importance for agriculture. Apparently similar soils can behave differently owing to intrinsic characteristics determined by different formation processes and the nature of the parent material. The purpose of this study was to assess the efficacy of separate or combined pre-treatments, dispersion methods and chemical dispersants for clay determination in several soil classes selected according to their mineralogy. Two Brazilian Oxisols, two Alfisols and one Mollisol with contrasting mineralogy were selected. Different treatments were applied: chemical dispersants (lithium hydroxide, sodium hydroxide, and hexametaphosphate); pre-treatments with dithionite, ammonium oxalate, and hydrogen peroxide to eliminate organic matter; and coarse sand as an abrasive and ultrasound to test mechanical action. It was concluded that different treatments must be applied for clay determination, depending on soil mineralogy. Lithium hydroxide was not efficient in dispersing low-CEC electropositive soils but was very efficient in dispersing high-CEC electronegative soils. The use of coarse sand as an abrasive increased the measured clay content of all soils in all treatments in which dispersion occurred, with or without chemical dispersants. The efficiency of coarse sand is not the same for all soil classes.
Abstract:
Soil surface roughness increases water retention and infiltration, reduces runoff volume and speed, and influences soil losses by water erosion. Like other parameters, soil roughness is affected by the tillage system and rainfall volume. Based on these assumptions, the main purpose of this study was to evaluate the effect of tillage treatments on soil surface roughness (RR) and tortuosity (T) and to investigate their relationship with soil and water losses in a series of simulated rainfall events. The field study was carried out at the experimental station of the EMBRAPA Southeastern Cattle Research Center in São Carlos (Fazenda Canchim), São Paulo State, Brazil. Experimental plots of 33 m² were subjected to two tillage practices in three replications: untilled (no-tillage) soil (NTS) and conventionally tilled (plowing plus double disking) soil (CTS). Three successive simulated rainfall tests were applied at 24 h intervals: a first rain of 30 mm/h, a second of 30 mm/h and a third of 70 mm/h. Immediately after tilling and after each rain simulation test, the surface roughness was measured using a laser profile meter. The tillage treatments induced significant changes in soil surface roughness and tortuosity, demonstrating the importance of the tillage system for the physical condition of the surface, favoring water retention and infiltration in the soil. The increase in surface roughness produced by the tillage treatments was considerably greater than its reduction by rain action. Surface roughness and tortuosity had a greater influence on the volume of soil lost by surface runoff in the no-tillage than in the conventional treatment. Possibly, other variables not analyzed in this study, such as soil type, declivity and slope length, influenced soil and water losses from the no-tillage treatments.