975 results for Cloud point


Relevance: 60.00%

Abstract:

The research described in this dissertation comprises two major parts. The first part studied the effects of asymmetric amphiphilic end groups on the thermo-response of diblock copolymers of oligo/di(ethylene glycol) methyl ether (meth)acrylates (OEGA/DEGMA) and of hybrid nanoparticles of these copolymers with a gold nanoparticle core. Placing the more hydrophilic end group on the more hydrophilic block significantly increased the cloud point compared to a similar copolymer composition with the end-group placement reversed. For a given composition, the cloud point shifted by as much as 28 °C depending on the placement of end groups. This is a much stronger effect than either changing the hydrophilic/hydrophobic block ratio or replacing the hydrophilic acrylate monomer with the equivalent methacrylate monomer. The temperature range of the coil-globule transition was also altered. Binding these diblock copolymers to a gold core decreased the cloud point by 5-15 °C and narrowed the temperature range of the coil-globule transition; the effects were more pronounced when the gold core was bound to the less hydrophilic block. Given the limited number of monomers approved as safe for in vivo use, amphiphilic end-group placement is a useful tool to tune a thermo-response without otherwise changing the copolymer composition. The second part of the dissertation investigated the production of value-added nanomaterials from two biorefinery “wastes”: lignin and peptidoglycan. Different solvents and spinning methods (melt-, wet-, and electro-spinning) were tested to make lignin/cellulose blended and carbonized fibers.
Only electro-spinning yielded fibers with a small enough diameter for efficient carbonization. Peptidoglycan (a bacterial cell wall material) was copolymerized with poly(3-hydroxybutyrate), a common polyhydroxyalkanoate produced by bacteria, with the objective of determining whether a useful material could be obtained with a less rigorous work-up when harvesting polyhydroxyalkanoates. The copolyesteramide containing 25 wt.% of a highly purified peptidoglycan increased thermal stability by 100-200 °C compared to the poly(3-hydroxybutyrate) control, while a less pure peptidoglycan, harvested from B. megaterium (ATCC 11561), gave a 25-50 °C increase in thermal stability. Both copolymers absorbed more moisture than pure poly(3-hydroxybutyrate). The results suggest that a less rigorously harvested and purified polyhydroxyalkanoate might be useful for some applications.

Relevance: 60.00%

Abstract:

Babassu and camelina oils have been transesterified with methanol by the classical homogeneous basic catalysis method with good yields. The babassu fatty acid methyl ester (FAME) was subjected to fractional distillation under vacuum, and the low-boiling fraction was blended with two types of fossil kerosene: a straight-run atmospheric distillation cut (hydrotreated) and a commercial Jet-A1. The camelina FAME was blended with the fossil kerosene without prior distillation. The blends of babassu biokerosene and Jet-A1 met some of the ASTM D1655 specifications selected for study: smoke point, density, flash point, cloud point, kinematic viscosity, oxidative stability, and lower heating value. The blends of babassu biokerosene and the atmospheric distillation cut, on the other hand, met only the density and oxidative stability specifications. The blends of camelina FAME and the atmospheric distillation cut met the following specifications: density, kinematic viscosity at −20 °C, and lower heating value. From these preliminary results, it can be concluded that blending babassu and camelina biokerosenes prepared in this way with commercial Jet-A1, at up to 10 vol % of the former, would be feasible, provided the blends meet all the ASTM D1655-09 specifications.

Relevance: 60.00%

Abstract:

This thesis proposes a new method for mapping non-destructive tests (NDT) in historical buildings using GIS-based techniques. First, a method is defined for producing and converting a 3D map, based on the point cloud of an architectural element obtained by photogrammetry, into raster and vector mapping legible by GIS systems, using a particular coordinate system that references each point of the photogrammetric cloud. This initial mapping is called the base planimetry. Next, a method is defined by which the points where an NDT test is performed are referenced to the coordinate system of the base plane, allowing the generation of referenced test maps and making it possible to overlay data from multiple tests on the same base plane. These new maps are called data maps, and their usefulness is demonstrated in the study of deterioration and moisture. The time factor is then included in the maps, and it is shown how this enables interdisciplinary work in producing a diagnosis. Finally, new maps, unpublished until now, are generated by combining different data maps over the same base planimetry. These new maps yield what have been defined as moisture isogram maps and salinity isogram maps, together with humidity, evaporation, salinity, and material degradation factors. This system provides a better overview of all the data obtained in the study of a historical building, supporting the correct and rigorous interpretation of the data for its subsequent restoration.
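The conversion from a photogrammetric point cloud to a GIS-legible raster can be pictured with a minimal sketch (purely illustrative; the thesis's actual toolchain and coordinate system are not reproduced here): points are binned into a regular grid and each cell stores an aggregate value, here the mean height.

```python
import numpy as np

def rasterize_point_cloud(points, cell_size=0.01):
    """Project a 3D point cloud (N x 3) onto a 2D raster by binning
    X/Y coordinates and keeping the mean Z per occupied cell."""
    xy = points[:, :2]
    origin = xy.min(axis=0)
    idx = np.floor((xy - origin) / cell_size).astype(int)
    shape = tuple(idx.max(axis=0) + 1)
    sums = np.zeros(shape)
    counts = np.zeros(shape)
    np.add.at(sums, (idx[:, 0], idx[:, 1]), points[:, 2])
    np.add.at(counts, (idx[:, 0], idx[:, 1]), 1)
    raster = np.full(shape, np.nan)  # NaN marks empty cells
    mask = counts > 0
    raster[mask] = sums[mask] / counts[mask]
    return raster, origin
```

The returned origin lets each raster cell be mapped back to the point cloud's coordinate frame, which is the referencing idea the abstract describes.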

Relevance: 60.00%

Abstract:

We have studied liquid-liquid phase separation in aqueous ternary solutions of calf lens γ-crystallin proteins. Specifically, we have examined two ternary systems containing γs: γIVa with γs in water, and γII with γs in water. For each system, the phase-separation temperatures Tph(φ)α as a function of the overall protein volume fraction φ at various fixed compositions α (the "cloud-point curves") were measured. For the γIVa, γs, and water ternary solution, a binodal curve composed of pairs of coexisting points, (φI, αI) and (φII, αII), at a fixed temperature (20 °C) was also determined. We observe that on the cloud-point curve the critical point is at a higher volume fraction than the maximum phase-separation temperature point. We also find that typically the difference in composition between the coexisting phases is at least as significant as the difference in volume fraction. We show that the asymmetric shape of the cloud-point curve is a consequence of this significant composition difference. Our observation that the phase-separation temperature of the mixtures in the high volume fraction region is strongly suppressed suggests that γs-crystallin may play an important role in maintaining the transparency of the lens.

Relevance: 60.00%

Abstract:

The use of pesticides has increased the productivity and quality of agricultural products, but it also leads to the intoxication of living organisms through the gradual ingestion of pesticide residues that contaminate soil, water, and food. Their concentrations in environmental compartments therefore require constant monitoring. To this end, fast, low-cost extraction and enrichment methods that generate little waste, in line with green chemistry, are sought. Among these methods, ultrasonic bath extraction and cloud point extraction stand out. After the extraction procedure, the extract can be analyzed by High-Performance Liquid Chromatography (HPLC) and Sequential Injection Chromatography (SIC), employing modern stationary phases such as monolithic columns and superficially porous particles. SIC with a monolithic column (C18, 50 x 4.6 mm) and with a column packed with superficially porous particles (C18, 30 x 4.6 mm, 2.7 µm particle size) was studied for the separation of simazine (SIM) and atrazine (ATR) and their metabolites desethylatrazine (DEA), desisopropylatrazine (DIA), and hydroxyatrazine (HAT). Separation was achieved by stepwise elution, with mobile phases composed of acetonitrile (ACN) and 2.5 mM ammonium acetate/acetic acid buffer (NH4Ac/HAc), pH 4.2. Separation on the monolithic column was performed with two mobile phases, MP1 = 15:85 (v/v) ACN:NH4Ac/HAc and MP2 = 35:65 (v/v) ACN:NH4Ac/HAc, at a flow rate of 35 µL s-1. Separation on the superficially porous particle column used MP1 = 13:87 (v/v) ACN:NH4Ac/HAc and MP2 = 35:65 (v/v) ACN:NH4Ac/HAc at a flow rate of 8 µL s-1. Ultrasonic bath extraction from soil fortified with the herbicides (100 and 1000 µg kg-1) gave recoveries between 42 and 160%.
The separation of DEA, DIA, HAT, SIM, and ATR by HPLC was achieved with a linear gradient from 13 to 35% ACN on the monolithic column and from 10 to 35% ACN on the superficially porous particle column, the aqueous phase being 2.5 mM NH4Ac/HAc buffer, pH 4.2. On both columns the flow rate was 1.5 mL min-1 and the analysis time 15 min. Ultrasonic bath extraction of soil samples containing ATR, fortified at concentrations from 250 to 1000 µg kg-1, gave recoveries between 40 and 86%. The presence of ATR was confirmed by mass spectrometry. Fortification studies with ATR and SIM were performed on water samples using cloud point extraction with the surfactant Triton X-114. HPLC separation was achieved with a linear gradient from 13 to 90% ACN on the monolithic column and from 10 to 90% ACN on the packed column, always in 2.5 mM NH4Ac/HAc buffer, pH 4.2. On both columns the flow rate was 1.5 mL min-1 and the analysis time 16 min. Fortifications between 1 and 50 µg L-1 gave recoveries between 65 and 132%.

Relevance: 60.00%

Abstract:

Produced water is a major problem associated with crude oil extraction. The levels of metals in this waste must be constantly monitored, which requires sensitive analytical techniques; the determination of trace elements often demands a pre-concentration step. The objective of this study was to develop a simple and rapid analytical method, based on the cloud point extraction phenomenon, for the extraction and pre-concentration of Cd, Pb, and Tl in produced water samples, with determination by high-resolution continuum source graphite furnace atomic absorption spectrometry. A Box-Behnken design was used to find the optimal extraction conditions for the analytes. The factors evaluated were the concentration of the complexing agent (ammonium O,O-diethyl dithiophosphate, DDTP), the hydrochloric acid concentration, and the surfactant (Triton X-114) concentration. The optimal extraction conditions were 0.6% m v-1 DDTP, 0.3 mol L-1 HCl, and 0.2% m v-1 Triton X-114 for Pb, and 0.7% m v-1 DDTP, 0.8 mol L-1 HCl, and 0.2% m v-1 Triton X-114 for Cd. For Tl, the best extraction occurred without DDTP, the conditions being 1.0 mol L-1 HCl and 1.0% m v-1 Triton X-114. The limits of detection of the proposed method were 0.005 µg L-1, 0.03 µg L-1, and 0.09 µg L-1 for Cd, Pb, and Tl, respectively. Enrichment factors were greater than 10. The method was applied to produced water from the Potiguar basin; spike-and-recovery tests gave values between 81% and 120%. The precision, expressed as relative standard deviation (RSD), was less than 5%.
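The figures of merit quoted above follow standard analytical definitions. As a hedged illustration (the abstract does not give the calibration data, so the numbers below are hypothetical), the detection limit and enrichment factor are commonly computed as:

```python
def limit_of_detection(blank_sd, slope):
    """IUPAC-style instrumental LOD: 3 x the standard deviation of
    blank measurements divided by the calibration-curve slope."""
    return 3.0 * blank_sd / slope

def enrichment_factor(slope_preconcentrated, slope_direct):
    """Ratio of the calibration slopes obtained with and without the
    cloud point pre-concentration step."""
    return slope_preconcentrated / slope_direct

# Hypothetical example: blank SD of 0.001 a.u. and slope 0.6 a.u. per µg/L
lod = limit_of_detection(0.001, 0.6)      # 0.005 µg/L
ef = enrichment_factor(12.0, 1.0)         # 12-fold enrichment
```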

Relevance: 60.00%

Abstract:

The textile sector is one of the main contributors to industrial wastewater generation, because it uses large volumes of water that acquire a high organic load. These effluents contain dyes, surfactants, starch, alcohols, acetic acid, and other constituents from the various textile processing steps. Treating textile wastewater before releasing it into water bodies is therefore essential, since it can cause disastrous physico-chemical changes in the environment. Surfactants are widely used in separation processes, and their use for treating textile wastewater was evaluated in this research by applying cloud point extraction and ionic flocculation. In the cloud point extraction, nonylphenol with an ethoxylation degree of 9.5 was used as the surfactant to remove a reactive dye. The process was evaluated in terms of temperature and of surfactant and dye concentrations; dye removal reached 91%. Ionic flocculation occurs in the presence of calcium, which reacts with an anionic surfactant to form insoluble surfactants capable of attracting organic matter by adsorption. In this work, ionic flocculation with base soap was applied to the treatment of synthetic wastewater containing dyes from three classes: direct, reactive, and disperse. The influence of the following parameters was evaluated: surfactant and electrolyte concentrations, stirring speed, equilibrium time, temperature, and pH. The surfactant flocculation was carried out in two ways: forming the floc in the effluent itself and forming the floc before mixing it into the effluent. When the floc was formed in the textile effluent, removal of the reactive and direct dyes was 97% and 87%, respectively. When the floc was formed before being added to the effluent, removal of the direct and disperse dyes reached 92% and 87%, respectively. These results show the efficiency of the evaluated processes for removing dyes from textile wastewater.

Relevance: 60.00%

Abstract:

This thesis presents the synthesis, characterization, and study of the associative behaviour in aqueous media of new responsive graft copolymers, based on carboxymethylcellulose (CMC) as the water-soluble backbone and Jeffamine® M-2070 and Jeffamine® M-600 (commercial polyetheramines) as the thermoresponsive grafts, which have high cloud point temperatures in water. The synthesis was performed in aqueous medium, using 1-ethyl-3-(3-(dimethylamino)propyl)carbodiimide hydrochloride and N-hydroxysuccinimide as activators of the reaction between the carboxylate groups of carboxymethylcellulose and the amino groups of the polyetheramines. The grafting reaction was confirmed by infrared spectroscopy and the grafting percentage by 1H NMR. The molar mass of the polyetheramines was determined by 1H NMR, whereas the molar masses of CMC and of the graft copolymers were determined by static light scattering. The salt effect on the association behaviour of the copolymers was evaluated in different aqueous media (Milli-Q water, 0.5 M NaCl, 0.5 M K2CO3, and synthetic sea water) at different temperatures, by UV-vis spectroscopy, rheology, and dynamic light scattering. None of the copolymer solutions, at 5 g/L, turned turbid in Milli-Q water when heated from 25 to 95 °C, probably because of the increase in hydrophilicity promoted by the CMC backbone. However, they became turbid in the presence of salts, due to the salting-out effect; the lowest cloud point was observed in 0.5 M K2CO3, which was attributed to the highest ionic strength in water combined with the ability of CO32- to weaken polymer-solvent interactions. The hydrodynamic radius and apparent viscosity of the copolymers in aqueous medium changed as a function of the dissolved salts, the temperature, and the copolymer composition. Thermothickening behaviour was observed in 0.5 M K2CO3 when the temperature was raised from 25 to 60 °C. This behaviour can be attributed to intermolecular associations forming a physical network, since the temperature is above the cloud point of the copolymers in this solvent.

Relevance: 40.00%

Abstract:

In semisupervised learning (SSL), a predictive model is learned from a collection of labeled data and a typically much larger collection of unlabeled data. This paper presents a framework called multi-view point cloud regularization (MVPCR), which unifies and generalizes several semisupervised kernel methods based on data-dependent regularization in reproducing kernel Hilbert spaces (RKHSs). Special cases of MVPCR include co-regularized least squares (CoRLS), manifold regularization (MR), and graph-based SSL. An accompanying theorem shows how to reduce any MVPCR problem to standard supervised learning with a new multi-view kernel.
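As a rough sketch of one of the special cases mentioned above, manifold regularization (Laplacian-regularized least squares) adds a graph-Laplacian penalty over labeled and unlabeled points to kernel ridge regression. The closed-form coefficients below follow the usual LapRLS solution; the scaling constants are illustrative and not taken from this paper.

```python
import numpy as np

def lap_rls(K, y, labeled, gamma_a, gamma_i, L):
    """Laplacian-regularized least squares (manifold regularization).
    K: (n, n) kernel matrix over labeled + unlabeled points
    y: (n,) targets (values at unlabeled points are ignored)
    labeled: (n,) boolean mask, True where a label is available
    gamma_a, gamma_i: ambient (RKHS) and intrinsic (graph) penalties
    L: (n, n) graph Laplacian built from all points.
    Returns expansion coefficients alpha; predictions are K @ alpha."""
    n = K.shape[0]
    l = labeled.sum()
    J = np.diag(labeled.astype(float))  # selects the labeled rows
    A = J @ K + gamma_a * l * np.eye(n) + (gamma_i * l / n**2) * (L @ K)
    return np.linalg.solve(A, labeled * y)
```

With `gamma_i = 0` and all points labeled, this reduces to ordinary kernel ridge regression, which is a quick sanity check on the formula.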

Relevance: 40.00%

Abstract:

Computer-generated holography is an extremely demanding and complex task when it comes to providing realistic reconstructions with full parallax, occlusion, and shadowing. We present an algorithm designed for data-parallel computing on modern graphics processing units to alleviate the computational burden. We apply Gaussian interpolation to create a continuous surface representation from discrete input object points. The algorithm maintains a potential-occluder list for each individual hologram-plane sample to keep the number of visibility tests to a minimum. We experimented with two approximations that simplify and accelerate occlusion computation. It is observed that letting several neighboring hologram-plane samples share visibility information on object points leads to significantly faster computation without causing noticeable artifacts in the reconstructed images. Computing a reduced sample set via nonuniform sampling is also found to be an effective acceleration technique. © 2009 Optical Society of America.
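The Gaussian interpolation step can be pictured with a small one-dimensional sketch (purely illustrative; the paper's GPU implementation and occlusion handling are not reproduced here): each discrete object point spreads its amplitude over nearby hologram-plane samples with a Gaussian weight, yielding a continuous surface representation.

```python
import numpy as np

def gaussian_splat(sample_x, point_x, point_amp, sigma):
    """Interpolate discrete object-point amplitudes onto hologram-plane
    sample positions with a Gaussian kernel (1-D sketch of the idea).
    sample_x: (S,) sample positions; point_x: (P,) object-point
    positions; point_amp: (P,) amplitudes."""
    # (S, P) matrix of Gaussian weights between samples and points
    d2 = (sample_x[:, None] - point_x[None, :]) ** 2
    w = np.exp(-d2 / (2.0 * sigma**2))
    return w @ point_amp
```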

Relevance: 40.00%

Abstract:

The commercial far-range (>10 m) spatial data collection methods for acquiring infrastructure’s geometric data are not completely automated because of the necessary manual pre- and/or post-processing work. The required amount of human intervention and, in some cases, the high equipment costs associated with these methods impede their adoption by the majority of infrastructure mapping activities. This paper presents an automated stereo vision-based method, as an alternative and inexpensive solution, to producing a sparse Euclidean 3D point cloud of an infrastructure scene utilizing two video streams captured by a set of two calibrated cameras. In this process SURF features are automatically detected and matched between each pair of stereo video frames. 3D coordinates of the matched feature points are then calculated via triangulation. The detected SURF features in two successive video frames are automatically matched and the RANSAC algorithm is used to discard mismatches. The quaternion motion estimation method is then used along with bundle adjustment optimization to register successive point clouds. The method was tested on a database of infrastructure stereo video streams. The validity and statistical significance of the results were evaluated by comparing the spatial distance of randomly selected feature points with their corresponding tape measurements.
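The triangulation step for a single matched feature pair can be sketched with the standard linear (DLT) method, a generic textbook formulation rather than necessarily the exact solver the authors use:

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one feature match.
    P1, P2: 3x4 camera projection matrices of the calibrated pair;
    x1, x2: matched image coordinates (u, v) in each view.
    Returns the 3D point in Euclidean coordinates."""
    # Each view contributes two linear constraints on the homogeneous point
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The solution is the right singular vector of the smallest singular value
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]
```

In the pipeline the abstract describes, this step would run on every SURF match that survives RANSAC, producing one 3D point per match.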

Relevance: 40.00%

Abstract:

Automating the model generation process of infrastructure can substantially reduce the modeling time and cost. This paper presents a method to generate a sparse point cloud of an infrastructure scene using a single video camera under practical constraints. It is the first step towards establishing an automatic framework for object-oriented as-built modeling. Motion blur and key frame selection criteria are considered. Structure from motion and bundle adjustment are explored. The method is demonstrated in a case study where the scene of a reinforced concrete bridge is videotaped, reconstructed, and metrically validated. The result indicates the applicability, efficiency, and accuracy of the proposed method.
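One common key-frame criterion, shown here only as an illustrative stand-in for the paper's actual selection rules, rejects motion-blurred frames by thresholding the variance of the image's Laplacian response (sharp frames have strong edges, so the response varies widely; blurred frames do not):

```python
import numpy as np

def blur_score(gray):
    """Variance of a 3x3 Laplacian response over a grayscale image;
    low values indicate blur."""
    lap = (-4 * gray[1:-1, 1:-1]
           + gray[:-2, 1:-1] + gray[2:, 1:-1]
           + gray[1:-1, :-2] + gray[1:-1, 2:])
    return lap.var()

def select_key_frames(frames, threshold):
    """Keep only frames whose sharpness score exceeds the threshold."""
    return [f for f in frames if blur_score(f) > threshold]
```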

Relevance: 40.00%

Abstract:

Most of the manual labor needed to create the geometric building information model (BIM) of an existing facility is spent converting raw point cloud data (PCD) to a BIM description. Automating this process would drastically reduce the modeling cost. Surface extraction from PCD is a fundamental step in this process. Compact modeling of redundant points in PCD as a set of planes leads to smaller file size and fast interactive visualization on cheap hardware. Traditional approaches for smooth surface reconstruction do not explicitly model the sparse scene structure or significantly exploit the redundancy. This paper proposes a method based on sparsity-inducing optimization to address the planar surface extraction problem. Through sparse optimization, points in PCD are segmented according to their embedded linear subspaces. Within each segmented part, plane models can be estimated. Experimental results on a typical noisy PCD demonstrate the effectiveness of the algorithm.
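The paper's sparsity-inducing subspace segmentation is more involved, but the basic task of isolating a dominant plane in noisy PCD can be contrasted with the classic RANSAC baseline (a generic sketch, not the authors' algorithm):

```python
import numpy as np

def ransac_plane(points, n_iters=200, tol=0.01, rng=None):
    """Fit one dominant plane to a noisy point cloud with RANSAC:
    repeatedly fit a plane through 3 random points and keep the model
    with the most inliers within distance `tol` of the plane.
    Returns ((unit normal, offset d), boolean inlier mask)."""
    rng = np.random.default_rng(rng)
    best_inliers = np.zeros(len(points), dtype=bool)
    best_model = None
    for _ in range(n_iters):
        sample = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        norm = np.linalg.norm(n)
        if norm < 1e-12:
            continue  # degenerate (collinear) sample
        n = n / norm
        d = -n @ sample[0]
        dist = np.abs(points @ n + d)  # point-to-plane distances
        inliers = dist < tol
        if inliers.sum() > best_inliers.sum():
            best_inliers, best_model = inliers, (n, d)
    return best_model, best_inliers
```

Running this repeatedly on the remaining points would segment several planes; the sparse-optimization approach instead segments all embedded subspaces jointly.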