895 results for Environmental Exposure.
Abstract:
Biochar is the solid, C-rich matrix obtained by pyrolysis of biomass, currently promoted as a soil amendment to offset anthropogenic C emissions while improving soil properties and growth conditions. The benefits of biochar appear promising, although the underlying science is only beginning to be explored. In this project, I performed a suite of experiments under controlled and field conditions to investigate the effect of biochar on: a) its interaction with minerals; b) Fe nutrition in kiwifruit; c) soil leaching, soil fertility, partitioning of soil CO2 emissions, the soil bacterial profile and key gene expression of nitrification-involved soil bacteria; d) plant growth, nutritional status, yield and fruit quality; and e) its physical-chemical changes under long-term environmental exposure. Biochar released K, P and Mg but retained Fe, Mn, Cu and Zn on its surface, which in turn hindered Fe nutrition of kiwifruit trees. A redox reaction on the biochar surface exposed to an Fe source was elucidated. Biochar reduced the amount of leached NH4+-N but increased that of Hg, K, P, Mo, Se and Sn. Furthermore, biochar interacted synergistically with compost, increasing soil field capacity, fertility and the leaching of DOC, TDN and RSOC, suggesting a priming effect. However, under field conditions, biochar did not affect yield, nutritional status or fruit quality. Actinomadura flavalba, Saccharomonospora viridis, Thermosporomyces composti and Enterobacter spp. were peculiar to the soil amended with biochar plus compost, which exhibited the highest band richness and promoted gene expression of Nitrosomonas spp. and Nitrobacter spp. and related enzymatic activity. Environmental exposure reduced the C and K content, pH and water infiltration of biochar while increasing its O, Si, N, Na, Al, Ca, Mn and Fe at%. Oxidation occurred on the aged biochar surface, decreased progressively with depth, and induced the development of O-containing functional groups down to a depth of 75 nm.
Abstract:
Waterproofing agents are widely used to protect leather and textiles in both domestic and occupational activities. An outbreak of acute respiratory syndrome following exposure to waterproofing sprays occurred during the winter of 2002-2003 in Switzerland. About 180 cases were reported by the Swiss Toxicological Information Centre between October 2002 and March 2003, whereas fewer than 10 cases per year had been recorded previously. The reported cases involved three brands of sprays containing a common waterproofing mixture that had undergone a formulation change in the months preceding the outbreak. A retrospective analysis was undertaken in collaboration with the Swiss Toxicological Information Centre and the Swiss Registries for Interstitial and Orphan Lung Diseases to clarify the circumstances and possible causes of the observed health effects. Individual exposure data were generated with questionnaires and experimental emission measurements. The collected data were used to conduct numerical simulations for 102 cases of exposure. A classical two-zone model was used to assess aerosol dispersion in the near and far field during spraying. The resulting dose and exposure estimates were spread over several orders of magnitude. No dose-response relationship was found between exposure indicators and health-effect indicators (perceived severity and clinical indicators). Weak relationships were found between unspecific inflammatory response indicators (leukocytes, C-reactive protein) and the maximal exposure concentration. The results disclose a high interindividual response variability and suggest that some indirect mechanism(s) predominate in the occurrence of the respiratory disease. Furthermore, no threshold could be found to define a safe level of exposure. These findings suggest that improving environmental exposure conditions during spraying alone is not a sufficient measure to prevent future outbreaks of waterproofing-spray toxicity. More efficient preventive measures are needed prior to the marketing and distribution of new waterproofing agents.
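As a minimal illustration of the classical two-zone model mentioned above, the sketch below computes the steady-state near-field and far-field concentrations from an emission rate, a room ventilation rate, and a near-field/far-field air-exchange rate, following the standard two-box formulation. The parameter values are hypothetical placeholders and are not taken from the study.

```python
# Minimal sketch of the steady-state two-zone (near-field / far-field) model.
# All parameter values are hypothetical, not data from the study.

def two_zone_steady_state(G, Q, beta):
    """Return (near-field, far-field) steady-state concentrations in mg/m^3.

    G    : contaminant emission rate during spraying (mg/min)
    Q    : general room ventilation rate (m^3/min)
    beta : air-exchange rate between near field and far field (m^3/min)
    """
    c_far = G / Q               # far field sees only dilution by ventilation
    c_near = G / Q + G / beta   # near field adds the inter-zone exchange term
    return c_near, c_far

if __name__ == "__main__":
    # Hypothetical spraying scenario: 50 mg/min release, 1 m^3/min ventilation,
    # 5 m^3/min near-field/far-field air exchange.
    c_near, c_far = two_zone_steady_state(G=50.0, Q=1.0, beta=5.0)
    print(f"near-field: {c_near:.1f} mg/m^3, far-field: {c_far:.1f} mg/m^3")
```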
Abstract:
Sick Building Syndrome is a prevalent problem, with patient complaints similar to typical allergy symptoms. Unlike most household allergens, the Asp f 1 allergen is conceivably ubiquitous in the work environment. This project examined levels of the Asp f 1 allergen in office and non-industrial occupational environments, and studied the bioaerosol and dust reservoirs of Aspergillus fumigatus responsible for those levels. Culturable bioaerosols of total mesophilic fungi were sampled with Andersen N6 impactors. Aggressive airborne and bulk dust samples were concurrently collected and assayed for Asp f 1. Bulk dusts were selectively cultured for A. fumigatus. Samples were collected during both wet and dry climatological conditions to examine the possibility of Asp f 1 increases due to fungal growth blooms. Only very low levels of Asp f 1 were detected, in relatively few samples. Analysis of wet versus dry period samples showed no differences in Asp f 1 levels, although A. fumigatus counts from dusts fluctuated significantly with exterior moisture events, as did the indoor prevalence of total colony-forming units. These results indicate that, even in the presence of elevated fungal concentrations, levels of Asp f 1 are extremely low. These levels do not correlate with climatological moisture events, despite distinct fungal blooms in the days immediately following those events. Non-industrial office buildings devoid of indoor air quality issues did not demonstrate significant levels or occurrence of Asp f 1 contamination in the geographical region of this study.
Abstract:
The production and use of carbon nanotubes (CNTs) can negatively impact human health and the environment through occupational, environmental, and product life-cycle exposures. Research is underway to evaluate the known, potential, and perceived hazards associated with CNTs. Recent research and policy analyses regarding CNTs were reviewed extensively. A facility engaged in the research, development, and manufacture of CNTs was observed handling CNTs, and associated individuals were informally interviewed. The combined investigation characterizes the current state of the art in our understanding and implementation of the policy needed to address the impacts of CNTs on human health and the environment. A gap analysis of regulations, policy, and CNT control methods is performed; conclusions and recommendations are drawn from the results of this analysis.
Abstract:
PB 250 424.
Abstract:
Relationships between cadmium (Cd) body burden, kidney function and coumarin metabolism were investigated in two groups of 197 and 200 healthy Thais, with men and women in nearly equal numbers. The mean age was 30.5 years in one group and 39.3 years in the other. Of the 397 subjects, 20 (5%) excreted urinary Cd between 1.4 µg/g and 3.8 µg/g creatinine, and these subjects faced a 10-15% increase in the probability of having abnormal urinary excretion of N-acetyl-beta-D-glucosaminidase (NAG-uria). The prevalence of NAG-uria varied with Cd body burden in a dose-dependent manner (chi-square = 22, P < 0.008). NAG-uria was also the one of the three kidney-effect markers tested that showed the greatest strength of correlation with urinary Cd in both men and women (r = 0.48, P < 0.001). In addition, urinary Cd excretion in men and women showed a positive correlation (r = 0.46 to 0.54, P < 0.001) with urinary 7-hydroxycoumarin (7-OHC) excretion, which was used as a marker of liver cytochrome P450 2A6 (CYP2A6) enzyme activity. Urinary Cd excretion accounted for 25% of the total variation in urinary 7-OHC excretion (P < 0.001). These data suggest that Cd may increase the expression of CYP2A6 in the liver, resulting in enhanced coumarin metabolism in subjects with a high Cd body burden. (C) 2003 Elsevier Ireland Ltd. All rights reserved.
Abstract:
The effects of cigarette smoking and exposure to dietary cadmium (Cd) and lead (Pb) on urinary biomarkers of renal function and on phenotypic variability of cytochrome P450 2A6 (CYP2A6) were investigated in a group of 96 healthy Thai men with a mean age of 36.7 years (range 19-57 years). In non-smokers, Cd burden increased with age (r = 0.47, P < 0.001). In current smokers, Cd burden increased with both age (r = 0.45, P = 0.01) and the number of cigarettes smoked per day (r = 0.32, P = 0.05). Cd-linked renal tubular dysfunction was seen in both smokers and non-smokers, but Pb-linked glomerular dysfunction was seen in smokers only, possibly due to more recent exposure to high levels of Cd and Pb, as reflected by 30-50% higher serum Cd and Pb levels in smokers than in non-smokers (P < 0.05). Exposure to dietary Cd and Pb appeared to be associated with mild tubular dysfunction, whereas dietary exposure plus cigarette smoking was associated with tubular plus glomerular dysfunction. Hepatic CYP2A6 activity in non-smokers showed a positive association with Cd burden (adjusted beta = 0.38, P = 0.006) but an inverse correlation with Pb (adjusted beta = -0.29, P = 0.003), suggesting opposing effects of Cd and Pb on the hepatic CYP2A6 phenotype. In contrast, CYP2A6 activity in current smokers did not correlate with Cd or Pb, but it showed a positive correlation with serum ferritin levels (r = 0.45, P = 0.01). These findings suggest that Pb concentrations in the liver were probably too low to inhibit hepatic synthesis of heme and CYP2A6, and that the concurrent induction of hepatic CYP2A6 and ferritin was probably due to cigarette smoke constituents other than Cd and Pb. (C) 2004 Elsevier Ireland Ltd. All rights reserved.
Abstract:
Recent epidemiological evidence indicates that arsenic exposure increases the risk of atherosclerosis and cardiovascular diseases (CVD) such as hypertension, coronary artery disease (CAD) and microangiopathies, in addition to the serious global health concern related to its carcinogenic effects. In animal experiments, acute and chronic exposure to arsenic correlates directly with cardiac tachyarrhythmia and atherogenesis in a concentration- and duration-dependent manner. Moreover, other effects of long-term arsenic exposure include the induction of non-insulin-dependent diabetes by mechanisms yet to be understood. On the other hand, there are controversial issues, gaps in knowledge, and future research priorities concerning the accelerated incidence of CVD and mortality in patients with HIV who are under long-term anti-retroviral therapy (ART). Although both HIV infection itself and various components of ART initiate significant pathological alterations in the myocardium and the vasculature, simultaneous environmental exposure to arsenic, which is increasingly recognized as a facilitator of HIV viral cycling in infected immune cells, may contribute an additional layer of adversity in these patients. A high degree of suspicion and early screening may allow appropriate interventional guidelines to improve the quality of life of those affected. In this mini-review, which has been fortified with our own preliminary data, we discuss some of the key current understanding of chronic arsenic exposure and its possible impact on accelerated HIV/ART-induced CVD. The review concludes with notes on recent developments in mathematical modelling in this field that probabilistically forecast incidence and prevalence as functions of ageing and lifestyle parameters, most of which vary with time themselves; this interdisciplinary approach provides a complementary kernel to conventional biology.
Abstract:
Studies have shown that the environmental conditions of the home are important predictors of health, especially in low-income communities. Understanding the relationship between the environment and health is crucial in the management of certain diseases. One health outcome related to the home environment among urban, minority, and low-income children is childhood lead poisoning. The most common sources of lead exposure for children are lead paint in older, dilapidated housing and dust and soil contaminated by the accumulated residue of leaded gasoline. Blood lead levels (BLL) as low as 10 μg/dL in children are associated with impaired cognitive function, behavior difficulties, and reduced intelligence. Recently, it has been suggested that the standard for intervention be lowered to a BLL of 5 μg/dL. The objectives of our report were to assess the prevalence of lead poisoning among children under six years of age and to quantify and test the correlations between BLL in children and lead exposure levels in their environment. This cross-sectional analysis was restricted to 75 children under six years of age who lived in 6 zip code areas of inner-city Miami. These locations exhibited unacceptably high levels of lead in dust and soil in areas where children live and play. Using 5 μg/dL as the cutoff point, the prevalence of lead poisoning in the study sample was 13.33%. The study revealed that lead levels in floor-dust and window-sill samples were positively and significantly correlated with BLL among the children (p < 0.05). However, the correlations between BLL and the soil, air, and water samples were not significant. Based on this pilot study, a more comprehensive environmental study in surrounding inner-city areas is warranted. Parental education on proper house-cleaning techniques may also benefit those living in the high lead-exposure communities of inner-city Miami.
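For illustration, the short sketch below reproduces the two calculations reported above on hypothetical data: the prevalence of BLL at or above the 5 μg/dL cutoff (13.33% of 75 children corresponds to 10 children) and a correlation test between children's BLL and lead loading in floor-dust samples. It assumes NumPy and SciPy; the arrays are invented for the example and are not the study data.

```python
# Illustrative calculation of prevalence above a BLL cutoff and the
# BLL vs floor-dust correlation; the arrays below are hypothetical.
import numpy as np
from scipy import stats

bll = np.array([2.1, 6.4, 3.0, 4.8, 1.9, 3.2, 4.4, 2.6, 3.9, 3.3])   # ug/dL
floor_dust_pb = np.array([12, 60, 20, 40, 10, 22, 35, 18, 30, 25])   # ug/ft^2

# Prevalence of BLL at or above the 5 ug/dL intervention cutoff.
# (In the study, 13.33% of 75 children corresponds to 10 children.)
cutoff = 5.0
prevalence = 100 * np.mean(bll >= cutoff)
print(f"prevalence above {cutoff} ug/dL: {prevalence:.1f}%")

# Pearson correlation between BLL and floor-dust lead loading.
r, p = stats.pearsonr(bll, floor_dust_pb)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")
```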
Abstract:
Bisphenol A (BPA), 2,2-bis(4-hydroxyphenyl)propane, is one of the highest-volume industrial chemicals used in the world, with production increasing every year. Environmental exposure to this xenoestrogen is considered a generalized phenomenon, with a Tolerable Daily Intake (TDI) of 4 µg/kg body weight/day established by the European Food Safety Authority. Several studies have focused on estimating human daily intake and the potential health effects associated with environmental exposures; however, despite the massive BPA production and consumption in European countries, with polycarbonate and epoxy resins as the major applications, occupational exposure to BPA has been overlooked and considered safe by the European authorities.
Abstract:
1. Essential hypertension occurs in people with an underlying genetic predisposition who subject themselves to adverse environmental influences. The number of genes involved is unknown, as is the extent to which each contributes to final blood pressure and the severity of the disease. 2. In the past, studies of potential candidate genes have been performed by association (case-control) analysis of unrelated individuals or linkage (pedigree or sibpair) analysis of families. These studies have resulted in several positive findings but, as one may expect, also an enormous number of negative results. 3. In order to uncover the major genetic loci for essential hypertension, it is proposed that scanning the genome systematically in 100-200 affected sibships should prove successful. 4. This involves genotyping sets of hypertensive sibships to determine their complement of several hundred microsatellite polymorphisms. Markers that are highly informative, by virtue of high heterozygosity, are most suitable; they also need to be spaced sufficiently evenly across the genome to ensure adequate coverage. 5. Tests are then performed to detect increased segregation of alleles of each marker with hypertension. The analytical tools involve specialized statistical programs that can detect such differences; non-parametric multipoint analysis is an appropriate approach. 6. In this way, loci for essential hypertension are beginning to emerge.
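To make the idea of testing for increased segregation of alleles concrete, the sketch below implements a simple single-marker affected-sib-pair mean test: under no linkage, affected sib pairs are expected to share, on average, half of their marker alleles identical by descent (IBD), and excess sharing points to a nearby susceptibility locus. This is only an illustrative, single-marker analogue of the non-parametric multipoint analysis mentioned above, which in practice is carried out with specialized statistical software; the IBD-sharing values are hypothetical.

```python
# Single-marker affected-sib-pair "mean test": compare observed mean
# IBD sharing against the null expectation of 0.5. Values are hypothetical.
import math

# Estimated proportion of alleles shared IBD by each affected sib pair
# at one microsatellite marker (0.0, 0.5 or 1.0 in the ideal fully
# informative case; intermediate values when IBD is estimated).
ibd_share = [0.5, 0.75, 1.0, 0.5, 0.5, 0.75, 1.0, 0.75, 0.5, 0.75,
             1.0, 0.5, 0.75, 0.75, 0.5, 1.0, 0.75, 0.5, 0.75, 1.0]

n = len(ibd_share)
mean_share = sum(ibd_share) / n
sd = math.sqrt(sum((x - mean_share) ** 2 for x in ibd_share) / (n - 1))
se = sd / math.sqrt(n)

# One-sided z test of H0: mean IBD sharing = 0.5 (no linkage).
z = (mean_share - 0.5) / se
print(f"mean IBD sharing = {mean_share:.2f}, z = {z:.2f} (large z suggests linkage)")
```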
Abstract:
Technical dinitrotoluene (DNT) is a mixture of 2,4- and 2,6-DNT. In humans, industrial or environmental exposure can occur orally, by inhalation, or through skin contact. The classification of DNT as an 'animal carcinogen' is based on the formation of malignant tumors in the kidneys, liver, and mammary glands of rats and mice. Clear signs of toxic nephropathy were found in rats dosed with DNT, and the concept of an interrelation between renal toxicity and carcinogenicity was derived. Recent data point to the carcinogenicity of DNT in the urinary tract of exposed humans. Between 1984 and 1997, 6 cases of urothelial cancer and 14 cases of renal cell cancer were diagnosed in a group of 500 underground miners in the copper mining industry of the former GDR who had high exposures to explosives containing technical DNT. The incidences of urothelial and renal cell tumors in this group were 4.5 and 14.3 times higher, respectively, than anticipated on the basis of the cancer registers of the GDR. Genotyping of all identified tumor patients for the polymorphic enzymes NAT2, GSTM1, and GSTT1 identified the urothelial tumor cases as exclusively 'slow acetylators'. A group of 161 miners highly exposed to DNT was investigated for signs of subclinical renal damage. The exposures were categorized semi-quantitatively as 'low', 'medium', 'high', and 'very high'. A clear dose-dependence of the excretion of urinary biomarker proteins on the exposure ranking was seen. Biomarker excretion (alpha1-microglobulin, glutathione S-transferases alpha and pi) indicated that DNT-induced damage was directed toward the tubular system. New data on DNT-exposed humans appear consistent with the concept of cancer initiation by DNT isomers and the subsequent promotion of renal carcinogenesis by selective damage to the proximal tubule. The differential pathways of metabolic activation of DNT appear to apply to the proximal tubule of the kidney and to the urothelium of the renal pelvis and lower urinary tract as the target tissues of carcinogenicity.
Abstract:
Introduction: Risk factor analyses for nosocomial infections (NIs) are complex. First, because of competing events for NI, the association between risk factors and NI measured using hazard rates may not coincide with the association measured using cumulative probability (risk). Second, patients from the same intensive care unit (ICU), who share the same environmental exposure, are likely to be more similar with regard to risk factors predisposing to an NI than patients from different ICUs. We aimed to develop an analytical approach to account for both features and to use it to evaluate associations between patient- and ICU-level characteristics and both the rates of NI and competing risks and the cumulative probability of infection. Methods: We considered a multicenter database of 159 intensive care units containing 109,216 admissions (813,739 admission-days) from the Spanish HELICS-ENVIN ICU network. We analyzed the data using two models: an etiologic (rate-based) model and a predictive (risk-based) model. In both models, random effects (shared frailties) were introduced to assess heterogeneity. Death and discharge without NI were treated as competing events for NI. Results: There was large heterogeneity across ICUs in NI hazard rates, which remained after accounting for multilevel risk factors, meaning that there are remaining unobserved ICU-specific factors that influence NI occurrence. Heterogeneity across ICUs in terms of the cumulative probability of NI was even more pronounced. Several risk factors had markedly different associations in the rate-based and risk-based models. For some, the associations differed in magnitude: for example, high Acute Physiology and Chronic Health Evaluation II (APACHE II) scores were associated with modest increases in the rate of nosocomial bacteremia but large increases in the risk. Others differed in sign: for example, a respiratory versus a cardiovascular diagnostic category was associated with a reduced rate of nosocomial bacteremia but an increased risk. Conclusions: A combination of competing risks and multilevel models is required to understand direct and indirect risk factors for NI and to distinguish patient-level from ICU-level factors.
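As a rough illustration of the two complementary views described above, the sketch below fits a rate-based, cause-specific Cox model for NI (treating death or discharge without NI as censoring) and estimates the risk-based cumulative incidence of NI with the Aalen-Johansen estimator. It assumes the Python lifelines library, uses an entirely hypothetical toy data set, and omits the shared-frailty (random-effects) component of the authors' multilevel analysis.

```python
# Contrast a rate-based (cause-specific hazard) model for NI with the
# risk-based cumulative incidence, using a hypothetical toy data set.
# Assumes the lifelines library; the shared-frailty component is omitted.
import pandas as pd
from lifelines import AalenJohansenFitter, CoxPHFitter

# event: 1 = NI, 2 = death/discharge without NI (competing event), 0 = censored
df = pd.DataFrame({
    "days":   [3, 7, 2, 10, 5, 8, 4, 12, 6, 9],
    "event":  [1, 2, 2, 1, 0, 1, 2, 1, 2, 0],
    "apache": [25, 12, 30, 22, 15, 28, 10, 27, 18, 14],
})

# Rate-based view: cause-specific Cox model for NI, with competing events
# censored at the time they occur.
cs = df.assign(ni=(df["event"] == 1).astype(int))
cph = CoxPHFitter()
cph.fit(cs[["days", "ni", "apache"]], duration_col="days", event_col="ni")
cph.print_summary()

# Risk-based view: cumulative incidence of NI in the presence of
# competing events (Aalen-Johansen estimator).
ajf = AalenJohansenFitter()
ajf.fit(df["days"], df["event"], event_of_interest=1)
print(ajf.cumulative_density_.tail())
```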