970 results for geometry clean-up


Relevance:

80.00%

Publisher:

Abstract:

Bioelectrochemical systems have potential for bioremediation of contaminants either in situ or ex situ. The treatment of a mixture of phenanthrene and benzene using two different tubular microbial fuel cells (MFCs), designed for either in situ or ex situ applications in aqueous systems, was investigated over long operational periods (up to 155 days). For in situ deployments, simultaneous removal of the petroleum hydrocarbons (>90% in terms of degradation efficiency) and of bromate, used as the catholyte (up to 79%), with concomitant biogenic electricity generation (peak power density up to 6.75 mW m−2), was obtained at a hydraulic retention time (HRT) of 10 days. The tubular MFC could be operated successfully under copiotrophic (100 ppm phenanthrene, 2000 ppm benzene at an HRT of 30 days) and oligotrophic (phenanthrene and benzene, 50 ppb each, HRT of 10 days) substrate conditions, suggesting its effectiveness and robustness at extreme substrate concentrations in anoxic environments. In the MFC designed for ex situ deployments, optimum performance was obtained at an HRT of 30 h, giving COD removal and maximum power output of approximately 77% and 6.75 mW m−2, respectively. The MFC withstood organic shock loadings and maintained stable performance. The results of this study suggest the potential of MFC technology for in situ/ex situ treatment of hydrocarbon-contaminated groundwater or clean-up of refinery effluents, even at extreme contaminant levels.
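The two operating figures quoted above, hydraulic retention time and degradation efficiency, follow from simple definitions (HRT = reactor volume / flow rate; removal = relative drop in concentration). A minimal sketch, with illustrative volumes and concentrations that are assumptions rather than values from the study:

```python
# HRT and removal efficiency as used in MFC reporting.
# All numeric inputs below are illustrative, not from the study.

def hrt_days(volume_l: float, flow_l_per_day: float) -> float:
    """Hydraulic retention time in days: reactor volume over feed flow rate."""
    return volume_l / flow_l_per_day

def removal_efficiency(c_in: float, c_out: float) -> float:
    """Percent removal of a contaminant across the reactor."""
    return 100.0 * (c_in - c_out) / c_in

# e.g. a hypothetical 1 L tubular MFC fed at 0.1 L/day gives a 10-day HRT
print(hrt_days(1.0, 0.1))               # 10.0
print(removal_efficiency(100.0, 10.0))  # 90.0
```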

Relevance:

80.00%

Publisher:

Abstract:

Background: Most studies have investigated ambient particles, although in most industrialized countries people spend most of their time indoors, and various indoor tasks, including cleaning, cause significant emissions of fine and ultrafine particles that lead to human exposure. Objective: To characterize occupational exposure to particles during the cleaning of hotel rooms. Methodology: Measurements of mass concentration and particle number concentration were performed before and during cleaning tasks in two rooms with different floor types (wood and carpet), using a Lighthouse model 3016 IAQ particle counter. Results: In terms of mass concentration, the larger particles were responsible for the higher levels of contamination, particularly PM5.0 and PM10.0. In terms of particle number concentration, however, the smallest particle sizes showed the highest values. Conclusion: Higher numbers of the smallest particles, which are associated with worse health effects, were observed in all tasks. The room with a wood floor showed lower values than the room with carpet. The tasks with the greatest exposure were vacuuming and dusting ('clean up powder').
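The apparent contradiction in the results, large particles dominating mass while small particles dominate counts, follows from the mass of a spherical particle scaling with the cube of its diameter. A sketch with hypothetical diameters, counts, and unit density (not measurements from the study):

```python
import math

# Mass per spherical particle scales with d^3, so a few coarse particles
# dominate mass concentration while number concentration is dominated by
# the smallest sizes. Illustrative numbers only, not study data.

def particle_mass_ug(d_um: float, rho: float = 1.0) -> float:
    """Mass (ug) of one spherical particle of diameter d (um), density rho (g/cm^3)."""
    volume_cm3 = (math.pi / 6.0) * (d_um * 1e-4) ** 3   # um -> cm
    return volume_cm3 * rho * 1e6                        # g -> ug

# 1000 ultrafine particles (0.3 um) vs a single PM10 particle (10 um)
ultrafine_total = 1000 * particle_mass_ug(0.3)
coarse_total = 1 * particle_mass_ug(10.0)
print(coarse_total / ultrafine_total)  # the single coarse particle far outweighs them
```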

Relevance:

80.00%

Publisher:

Abstract:

For fifty years (1949–99) the now-abandoned Giant Mine in Yellowknife emitted arsenic pollution into the surrounding air and water. Arsenic pollution from Giant Mine had particularly acute health impacts on the nearby Yellowknives Dene First Nation (YKDFN), who relied on local lakes, rivers, and streams for their drinking water, in addition to frequent use of local berries, garden produce, and medicine plants. Currently, the Canadian government is undertaking a remediation project at Giant Mine to clean up contaminated soils and tailings on the surface and to contain the 237,000 tonnes of arsenic dust stored underground at the site. Using documentary sources and statements of Yellowknives Dene members before various public hearings on the arsenic issue, this paper examines the history of arsenic pollution at Giant Mine as a form of "slow violence," a concept that reconfigures the arsenic issue not simply as a technical problem, but as a historical agent of colonial dispossession that alienated an Indigenous group from their traditional territory. The long-term storage of arsenic at the former mine site means the effects of this slow violence are not merely historical, but extend into the potentially far distant future.

Relevance:

80.00%

Publisher:

Abstract:

The development of suitable methods for monitoring residues and contaminants in food is of the utmost importance, as it is the only way to guarantee food safety and prevent harm to consumer health. To that end, such methods must be fast, easy, and low-cost, and capable of detecting residues at low concentrations in different matrices. This work consisted of developing a method for the determination of 5 sedatives and 14 β-blockers in swine kidney samples, with subsequent analysis by Liquid Chromatography coupled to Tandem Mass Spectrometry (LC-MS/MS). The extraction procedure best suited to these compounds consisted of weighing 2 g of sample and adding 10 mL of acetonitrile, followed by homogenization with an Ultra-Turrax and a shaker table. After extraction, the samples were submitted to two clean-up techniques: low-temperature freezing of the extract and dispersive solid-phase extraction (d-SPE) using Celite® 545 as the sorbent. A concentration step was performed in a sample concentrator under N2 flow at controlled temperature. The dried samples were redissolved in methanol and analyzed on an LC-MS/MS system with Electrospray Ionization (ESI) operating in positive MRM mode, using a Poroshell 120 EC-C18 column (3.0 x 50 mm, 2.7 μm) for analyte separation and a mobile-phase gradient composed of (A) aqueous solution acidified with 0.1% formic acid (v/v) and (B) methanol with 0.1% formic acid (v/v). The validation parameters evaluated were linearity, selectivity, matrix effect, precision, trueness, recovery, decision limit, detection capability, measurement uncertainty, robustness, and the limits of detection and quantification. In addition, the performance criteria applicable to mass-spectrometric detection and the stability of the compounds were assessed.
Recovery was evaluated at 10 μg kg-1 and trueness at 5, 10, and 15 μg kg-1, with satisfactory results of 70-85% and 90-101%, respectively. The limit of quantification was 2.5 μg kg-1, except for carazolol, for which it was 1.25 μg kg-1. The linearity study was performed between 0 and 20 μg kg-1 and yielded coefficients of determination above 0.98. These procedures were carried out on fortified blank matrix. Furthermore, the method was used to analyze carazolol, azaperone, and azaperol in swine kidney samples from a collaborative trial, giving results very close to the assigned values. It can therefore be concluded that the developed method is suitable for the analysis of sedatives and β-blockers, with efficient compound extraction and extract clean-up using fast, easy, and low-cost procedures that ensure safe and reliable results.
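Two of the validation figures of merit above, recovery from fortified blanks and linearity expressed as a coefficient of determination, can be sketched directly. The concentration levels and detector responses below are hypothetical, chosen only to fall inside the acceptance windows reported (70-85% recovery, R² > 0.98):

```python
# Recovery and linearity (R^2) as computed in method validation.
# All numeric data below are hypothetical illustrations.

def recovery_pct(measured: float, fortified: float) -> float:
    """Measured concentration as a percentage of the fortification level."""
    return 100.0 * measured / fortified

def r_squared(x, y):
    """Coefficient of determination for an ordinary least-squares line."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((yi - (slope * xi + intercept)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return 1.0 - ss_res / ss_tot

conc = [0, 2.5, 5, 10, 15, 20]            # ug/kg, hypothetical calibration levels
resp = [10, 260, 495, 1010, 1490, 2005]   # detector response, hypothetical
print(r_squared(conc, resp))              # should exceed the 0.98 criterion
print(round(recovery_pct(7.8, 10.0), 1))  # 78.0, inside the 70-85% window
```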

Relevance:

80.00%

Publisher:

Abstract:

Silver and mercury are both dissolved by cyanide leaching, and the mercury co-precipitates with silver during metal recovery. Mercury must then be removed from the silver/mercury amalgam by vaporizing it in a retort, which creates environmental and health hazards. The need for retorting silver can be greatly reduced if mercury is selectively removed from the leaching solutions. Theoretical calculations were carried out based on the thermodynamics of the Ag/Hg/CN- system in order to determine possible approaches to either preventing mercury dissolution or selectively precipitating it without silver loss. Preliminary experiments were then carried out based on these calculations to determine whether the reactions would be spontaneous with reasonably fast kinetics. In an attempt to stop mercury from dissolving and leaching out of the heap, the first set of experiments examined whether selenium and mercury would form mercury selenide under leaching conditions, lowering the amount of mercury in solution while forming a stable compound. The results of the synthetic ore experiments with selenium showed that another effect was already suppressing mercury dissolution, and the contribution of the selenium could not be well analyzed given the small amount of change. The effect dominating the reactions led to the second set of experiments, which used silver sulfide as a selective precipitant for mercury. These experiments examined whether adding solutions containing mercury cyanide to un-leached silver sulfide would drive a precipitation reaction, putting silver into solution and precipitating mercury as mercury sulfide. Counter-current flow experiments using the high-selenium ore showed 99.8% removal of mercury from solution. Compared to leaching with cyanide only, about 60% of the silver was removed per pass for the high-selenium ore, and around 90% for the high-mercury ore.
Since silver sulfide is rather expensive to use solely as a mercury precipitant, another compound was sought that could selectively precipitate mercury while leaving silver in solution. As a less expensive selective precipitant, zinc sulfide was tested. The third set of experiments showed that zinc sulfide (as sphalerite) can be used to selectively precipitate mercury while leaving silver cyanide in solution. Parameters such as particle size, reduction potential, and degree of oxidation of the sphalerite were tested. Batch experiments worked well, showing 99.8% mercury removal with only ≈1% silver loss (starting with 930 ppb mercury, 300 ppb silver) at one hour. A continuous-flow process would suit industrial applications better, and this was demonstrated with a filter-funnel setup. Funnels fitted with filter paper and sphalerite showed good mercury removal (from 31 ppb mercury and 333 ppb silver, an 87% mercury removal with 7% silver loss through one funnel). A counter-current flow setup showed 100% mercury removal and under 0.1% silver loss, starting with 704 ppb silver and 922 ppb mercury. The resulting sphalerite, coated with mercury sulfide, was also shown to be stable (not releasing mercury) under leaching tests. Sphalerite could easily be implemented in currently existing processes, for example as sphalerite-impregnated filter paper. In summary, this work focuses on preventing mercury from following silver through the leaching circuit. Currently the only practical means of removing the mercury is by retort, creating possible health hazards in the distillation process and in the transportation and storage of the final mercury waste product. Preventing mercury from following silver in the earlier stages of the leaching process will greatly reduce the risk of mercury spills, human exposure to mercury, and possible environmental disasters.
It will also save mining companies millions of dollars in mercury handling and storage and in projects to clean up spilled mercury, and will result in better health for those living near and working in the mines.

Relevance:

80.00%

Publisher:

Abstract:

This thesis presents a study of Grid data access patterns in distributed analysis in the CMS experiment at the LHC accelerator. The study ranges from a deep analysis of the historical patterns of access to the most relevant data types in CMS, to the exploitation of a supervised machine learning classification system to set up machinery able to predict future data access patterns - the so-called "popularity" of CMS datasets on the Grid - with a focus on specific data types. All CMS workflows run on the Worldwide LHC Computing Grid (WLCG) computing centers (Tiers), and the distributed analysis system in particular sustains hundreds of users and applications submitted every day. These applications (or "jobs") access different data types hosted on disk storage systems at a large set of WLCG Tiers. A detailed study of how these data are accessed, in terms of data types, hosting Tiers, and time periods, yields precious insight into storage occupancy over time and into the different access patterns, and ultimately allows suggested actions to be extracted from this information (e.g. targeted disk clean-up and/or data replication). In this sense, applying machine learning techniques makes it possible to learn from past data and gain predictive power over future CMS data access patterns. Chapter 1 provides an introduction to High Energy Physics at the LHC. Chapter 2 describes the CMS Computing Model, with special focus on the data management sector, and discusses the concept of dataset popularity. Chapter 3 describes the study of CMS data access patterns at different depth levels. Chapter 4 offers a brief introduction to basic machine learning concepts, introduces their application in CMS, and discusses the results obtained with this approach in the context of this thesis.
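The supervised-classification idea can be sketched in miniature: label past datasets "popular" or "unpopular" from historical access counts, then predict the label of a new dataset from simple features. The features (weekly accesses, distinct users, dataset age) and the nearest-centroid rule below are illustrative assumptions, not the thesis' actual feature set or model:

```python
# Minimal nearest-centroid classifier for dataset "popularity".
# Features and training rows are hypothetical illustrations.

def centroid(rows):
    """Component-wise mean of a list of feature vectors."""
    n = len(rows)
    return [sum(r[i] for r in rows) / n for i in range(len(rows[0]))]

def dist2(a, b):
    """Squared Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

# feature rows: [accesses/week, distinct users, dataset age in weeks]
popular = [[900, 40, 4], [1200, 55, 2], [800, 35, 6]]
unpopular = [[10, 2, 50], [5, 1, 80], [25, 3, 30]]
cp, cu = centroid(popular), centroid(unpopular)

def predict(features):
    """Assign the label of the nearer class centroid."""
    return "popular" if dist2(features, cp) < dist2(features, cu) else "unpopular"

print(predict([1000, 45, 3]))   # popular
print(predict([8, 1, 60]))      # unpopular
```

In practice one would also normalize the features, since raw access counts dwarf the other components; the toy data here are separated widely enough that the unscaled rule still works.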

Relevance:

80.00%

Publisher:

Abstract:

Atenolol is a β-blocker drug commonly found in wastewaters due to the inability of conventional wastewater treatment processes to remove it. In this study, subsurface-flow constructed wetland microcosm systems were established with a matrix of light expanded clay aggregates (LECA) and planted with Phragmites australis in order to evaluate their ability to remove atenolol from wastewater. For the detection and quantification of atenolol in aqueous solutions (water and wastewater), an analytical methodology was developed and optimized using chromatographic separation by HPLC with diode array (HPLC-DAD) or UV-Vis spectrophotometric (HPLC-UV-Vis) detection. A sample clean-up and preconcentration procedure by solid-phase extraction (SPE) was also developed for use whenever analyte concentrations fell below the instrument's limit of quantification. Combined with an efficient SPE preconcentration step, the HPLC methodology yielded an analytical method with a very low limit of quantification (9 ng mL-1) and high reproducibility (RSD < 4%). An overall atenolol removal efficiency of 93% was achieved after a retention time of only 4 days in the microcosm systems planted with Phragmites australis. Beds containing only LECA and beds with LECA and plants were both tested for atenolol removal.
The removal kinetics was characterized by an initial fast step (removal of about 75% after just 24 h), which is mainly attributable to adsorption onto the LECA matrix. Atenolol removal in the LECA-only beds continued to increase at a steady pace up to the end of the assay (8 days), remaining nevertheless about 5-10% lower than that observed in the planted beds after the first 4 days. For the 4-day retention time, most of the atenolol is removed by the LECA matrix, but an additional 12-14% of the overall removal efficiency can be attributed to the Phragmites plants, in agreement with previously published reports. Although further tests using larger-scale systems are required to fully evaluate the behavior of atenolol in a constructed wetland system, this study points to the possible application of these relatively low-cost systems to treat atenolol-contaminated wastewater.
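The two-phase behavior described above (a fast adsorption step followed by a slow approach to a plateau) is often summarized with a first-order model, R(t) = R_inf (1 - e^(-kt)). The asymptote and rate constant below are hypothetical values chosen only to roughly reproduce the reported ~75% removal at 1 day; they are not fitted parameters from the study:

```python
import math

# First-order approach-to-plateau sketch of the removal kinetics.
# r_inf and k are illustrative assumptions, not fitted study parameters.

def removal(t_days: float, r_inf: float = 95.0, k: float = 1.56) -> float:
    """Percent removal after t days under a first-order uptake model."""
    return r_inf * (1.0 - math.exp(-k * t_days))

print(round(removal(1), 1))  # fast initial step, near the ~75% observed at 24 h
print(round(removal(4), 1))  # close to the asymptote after 4 days
```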

Relevance:

80.00%

Publisher:

Abstract:

The basidiomycete Phanerochaete chrysosporium has been proposed for use as a bioremediation agent in areas contaminated by complex polluting compounds. This fungus produces laccases and peroxidases, enzymes normally involved in the degradation of lignin, a complex polyaromatic substance. These enzymes are also responsible for the degradation of a diverse range of compounds, among them some pesticides, for example DDT. The systemic fungicide carbendazim (MBC), although relatively resistant to biodegradation, is transformed by the action of some microorganisms. The present study aimed to verify the effect of P. chrysosporium on the degradation of carbendazim. The fungus was incubated in liquid culture medium (potato dextrose) enriched with 100 ppm of carbendazim. Quantitative determination of the fungicide after 2, 3, 6, and 22 days of incubation was carried out by high-performance liquid chromatography (HPLC) after extraction and clean-up of the sample. Analysis of the residues in the culture medium showed that P. chrysosporium degrades 77.6% of the carbendazim within the first days of incubation, this value then remaining unchanged in the subsequent analyses. The growth curves in the liquid culture media, BD and BD + MBC, showed that in both cases optimal growth was reached on the third and fourth day, respectively.

Relevance:

80.00%

Publisher:

Abstract:

This work evaluated the allelopathic potential of organic extracts obtained from the leaves of Calopogonium mucunoides on the seed germination of weeds commonly found in cultivated pasture areas of the Brazilian Amazon, where they cause major losses in productivity: Cassia tora ("mata-pasto"), Mimosa pudica ("malícia"), and Cassia occidentalis ("fedegoso"). Secondary compounds in the crude extracts were identified and quantified using capillary electrophoresis. After the compounds present in the extracts had been identified and quantified, new bioassays were carried out with standards of the identified compounds in order to verify whether they could act as inhibitors of the germination of the weed seeds under study. Calopogonium mucunoides showed allelopathic potential, which varied with the weed species studied. The protocols developed using capillary electrophoresis proved efficient and quite specific, allowing the separation and identification of 5 classes of compounds in the crude extracts without the need for clean-up or fractionation, with fast analyses (under 20 minutes) and low solvent consumption compared to traditional analytical methods. Several of the identified compounds showed germination-inhibition potential on the seeds studied, with Mimosa pudica the most sensitive; the bioassays also indicated a certain synergistic effect when the mixture of compounds was used.

Relevance:

40.00%

Publisher:

Abstract:

The equilibrium geometry, electronic structure and energetic stability of Bi nanolines on clean and hydrogenated Si(001) surfaces have been examined by means of ab initio total energy calculations and scanning tunnelling microscopy. For the Bi nanolines on a clean Si surface the two most plausible structural models, the Miki or M model (Miki et al 1999 Phys. Rev. B 59 14868) and the Haiku or H model (Owen et al 2002 Phys. Rev. Lett. 88 226104), have been examined in detail. The results of the total energy calculations support the stability of the H model over the M model, in agreement with previous theoretical results. For Bi nanolines on the hydrogenated Si(001) surface, we find that an atomic configuration derived from the H model is also more stable than an atomic configuration derived from the M model. However, the energetically less stable (M) model exhibits better agreement with experimental measurements for equilibrium geometry. The electronic structures of the H and M models are very similar. Both models exhibit a semiconducting character, with the highest occupied Bi-derived bands lying at ~0.5 eV below the valence band maximum. Simulated and experimental STM images confirm that at a low negative bias the Bi lines exhibit an 'antiwire' property for both structural models.

Relevance:

40.00%

Publisher:

Abstract:

Cardiovascular disease is the leading cause of death in the developed world. Wall shear stress (WSS) is associated with the initiation and progression of atherogenesis. This study combined recent advances in MR imaging and computational fluid dynamics (CFD) to evaluate a patient-specific carotid bifurcation. The patient was followed up for 3 years. Geometry changes (tortuosity, curvature, ICA/CCA area ratios, central cross-sectional curvature, maximum stenosis) and CFD quantities (velocity distribution, wall shear stress (WSS), and oscillatory shear index (OSI)) were compared at the different time points. The carotid stenosis showed a slight increase in central cross-sectional curvature, with minor and variable curvature changes along the carotid centerline. The OSI distribution presented high values in the region bordering the stenosis and the normal vessel, indicating complex flow and recirculation. The significant geometric changes observed during the follow-up may also cause significant changes in the bifurcation hemodynamics.
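The oscillatory shear index compared in the follow-up is commonly defined as OSI = 0.5 (1 - |mean WSS vector| / mean |WSS|), ranging from 0 for unidirectional flow to 0.5 for fully reversing flow. A minimal sketch of that definition, with hypothetical sampled WSS vectors:

```python
import math

# OSI from time-sampled WSS vectors: 0.5 * (1 - |mean(WSS)| / mean(|WSS|)).
# The sample vectors below are hypothetical, not patient data.

def osi(wss_vectors):
    """Oscillatory shear index from a list of WSS vectors over one cycle."""
    n = len(wss_vectors)
    mean_vec = [sum(v[i] for v in wss_vectors) / n
                for i in range(len(wss_vectors[0]))]
    mag_of_mean = math.sqrt(sum(c * c for c in mean_vec))
    mean_of_mag = sum(math.sqrt(sum(c * c for c in v)) for v in wss_vectors) / n
    return 0.5 * (1.0 - mag_of_mean / mean_of_mag)

steady = [[1.0, 0.0], [1.0, 0.0], [1.0, 0.0]]
oscillating = [[1.0, 0.0], [-1.0, 0.0], [1.0, 0.0], [-1.0, 0.0]]
print(osi(steady))       # 0.0: unidirectional shear
print(osi(oscillating))  # 0.5: fully reversing flow
```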

Relevance:

30.00%

Publisher:

Abstract:

Real-time kinematic (RTK) GPS techniques have been extensively developed for applications including surveying, structural monitoring, and machine automation. Limitations of the existing RTK techniques that hinder their application for geodynamics purposes are twofold: (1) the achievable RTK accuracy is at the level of a few centimeters, and the uncertainty of the vertical component is 1.5-2 times worse than that of the horizontal components; and (2) the RTK position uncertainty grows in proportion to the base-to-rover distance. The key limiting factor behind these problems is the significant effect of residual tropospheric errors on the positioning solutions, especially on the highly correlated height component. This paper develops a geometry-specified troposphere decorrelation strategy to achieve subcentimeter kinematic positioning accuracy in all three components. The key is to set up a relative zenith tropospheric delay (RZTD) parameter to absorb the residual tropospheric effects and to solve the resulting model as an ill-posed problem using the regularization method. In order to compute a reasonable regularization parameter and obtain an optimal regularized solution, the covariance matrix of the positional parameters estimated without the RZTD parameter, which is characterized by the observation geometry, is used to replace the quadratic matrix of their "true" values. As a result, the regularization parameter is computed adaptively as the observation geometry varies. The experimental results show that the new method can efficiently alleviate the model's ill-conditioning and stabilize the solution from a single data epoch. Compared to the results from the conventional least squares method, the new method improves the long-range RTK solution precision from several centimeters to the subcentimeter level in all components. More significantly, the precision of the height component is even higher.
Several geoscience applications that require subcentimeter real-time solutions can benefit substantially from the proposed approach, such as real-time monitoring of earthquakes and large dams, high-precision GPS leveling, and refinement of the vertical datum. In addition, the high-resolution RZTD solutions can contribute to effective recovery of tropospheric slant path delays in order to establish 4-D troposphere tomography.
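The regularization step described above can be sketched generically: for an ill-conditioned design matrix A, the Tikhonov solution x = (AᵀA + λI)⁻¹ Aᵀb damps the noise amplification of plain least squares. The 2x2 system and the fixed λ below are illustrative; the paper computes its regularization parameter adaptively from the observation geometry rather than using a hand-picked constant:

```python
# Tikhonov-regularized least squares on a deliberately ill-conditioned
# 2-parameter system. Matrix and lambda are illustrative, not GPS data.

def solve2(M, v):
    """Solve a 2x2 linear system M x = v by Cramer's rule."""
    det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    return [(v[0] * M[1][1] - M[0][1] * v[1]) / det,
            (M[0][0] * v[1] - v[0] * M[1][0]) / det]

def tikhonov(A, b, lam):
    """Regularized normal equations: (A^T A + lam I) x = A^T b."""
    ata = [[sum(A[k][i] * A[k][j] for k in range(len(A)))
            + (lam if i == j else 0.0) for j in range(2)] for i in range(2)]
    atb = [sum(A[k][i] * b[k] for k in range(len(A))) for i in range(2)]
    return solve2(ata, atb)

# nearly collinear columns -> ill-conditioned least-squares problem
A = [[1.0, 1.0], [1.0, 1.0001], [1.0, 0.9999]]
b = [2.0, 2.0001, 1.9999]
print(tikhonov(A, b, 0.0))    # unregularized solution, noise-sensitive
print(tikhonov(A, b, 1e-3))   # damped, numerically stable estimate near [1, 1]
```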