872 results for ONE-STEP SYNTHESIS
Abstract:
The textile industry generates large volumes of effluent with a high organic load, whose intense color arises from residual dyes. Because of the environmental implications of this category of contaminant, there is a permanent search for methods to remove these compounds from industrial wastewater. Adsorption is one of the most efficient approaches to such sequestering/remediation, particularly when it uses inexpensive materials such as agricultural residues (e.g., sugarcane bagasse) and cotton dust waste (CDW) from weaving, in their natural or chemically modified forms. The inclusion of quaternary amino (DEAE+) and methylcarboxylic (CM-) groups in the CDW cellulosic structure generates an ion exchange capacity in this formerly inert matrix and, consequently, consolidates its ability for electrovalent adsorption of residual textile dyes. The ionic matrices obtained were evaluated for pHpzc and for their retention efficiency toward various textile dyes under different experimental conditions (initial concentration, temperature, and contact time) in order to determine the kinetic and thermodynamic parameters of batch adsorption, so that the process could be understood from the respective isotherms. A shift in pHpzc was observed for CM--CDW (6.07) and DEAE+-CDW (9.66) relative to native CDW (6.46), confirming changes in the total surface charge. The ionized matrices were effective in removing all the pure or residual textile dyes evaluated under the various experimental conditions tested. The adsorption kinetic data best fitted a pseudo-second-order model, and an intraparticle diffusion model suggested that the process takes place in more than one step. The time required for the system to reach equilibrium varied with the initial dye concentration, being shorter in dilute solutions. The Langmuir isotherm model gave the best fit to the experimental data.
The maximum adsorption capacity varied for each tested dye and is closely related to the adsorbent/adsorbate interaction and to the chemical structure of the dye. Only a few dyes showed a linear variation of the equilibrium constant Ka with inverse temperature, which may have influenced their thermodynamic behavior. The dyes that could be evaluated in this respect, such as BR 18:1 and AzL, showed features of an endothermic adsorption process (ΔH° positive), while the dye VmL showed characteristics of an exothermic process (ΔH° negative). ΔG° values suggested that adsorption occurred spontaneously, except for the BY 28 dye, and the ΔH° values indicated that adsorption occurred by chemisorption. The 31-51% reduction in the biodegradability of the matrices after dye adsorption means that they must go through a cleaning process before being discarded or recycled, and the regeneration test indicated that the matrices can be reused up to five times without loss of performance. The DEAE+-CDW matrix was efficient in removing color from a real textile effluent, reaching a 93% decrease in UV-Visible spectral area when applied at 15 g of ion exchanger matrix L-1 of colored wastewater, even in the parallel presence of 50 g L-1 of mordant salts in the wastewater. The colored-matter removal achieved by the synthesized matrices ranged widely, from 40.27 to 98.65 mg g-1 of ionized matrix, depending on the particular chemical structure of each adsorbed dye.
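As a minimal illustration of the two models the abstract reports as the best fits, the sketch below recovers pseudo-second-order kinetic and Langmuir isotherm parameters from their classical linearised forms. All numerical values here are assumed for the example, not data from the thesis.

```python
import numpy as np

# Pseudo-second-order kinetics: t/qt = 1/(k2*qe^2) + t/qe  (linear in t)
t = np.array([5.0, 10.0, 20.0, 40.0, 60.0, 90.0, 120.0])  # contact time (min)
qe_true, k2_true = 45.0, 0.004                             # assumed qe (mg/g), k2 (g mg-1 min-1)
qt = qe_true**2 * k2_true * t / (1 + qe_true * k2_true * t)

slope, intercept = np.polyfit(t, t / qt, 1)                # fit the linearised form
qe_fit = 1.0 / slope                                       # slope = 1/qe
k2_fit = slope**2 / intercept                              # intercept = 1/(k2*qe^2)

# Langmuir isotherm: Ce/qe = Ce/qmax + 1/(KL*qmax)  (linear in Ce)
Ce = np.array([2.0, 5.0, 10.0, 20.0, 40.0, 80.0])          # equilibrium conc. (mg/L)
qmax_true, KL_true = 98.65, 0.15                           # assumed parameters
qe_iso = qmax_true * KL_true * Ce / (1 + KL_true * Ce)

s, i = np.polyfit(Ce, Ce / qe_iso, 1)
qmax_fit = 1.0 / s                                         # maximum adsorption capacity (mg/g)

print(round(qe_fit, 2), round(qmax_fit, 2))                # 45.0 98.65
```

Because the synthetic data follow each model exactly, the linearised fits recover the assumed parameters; with real batch data the same regressions yield qe, k2, qmax, and KL with residual scatter.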
Abstract:
The main objective of the present thesis is the development of an analytical preconcentration technology for the concomitant extraction and concentration of human pollution tracers from wastewater streams. Owing to the outstanding tunable properties of ionic liquids (ILs), aqueous biphasic systems (ABS) composed of ILs can provide higher and more selective extraction efficiencies for a wide range of compounds, and are thus a promising alternative to the volatile and hazardous organic solvents (VOCs) typically used. For that purpose, IL-based ABS were employed and adequately characterized as a one-step extraction and concentration technique. The applicability of IL-based ABS was verified through their potential to completely extract and concentrate two representative pharmaceutical pollution tracers, caffeine (CAF) and carbamazepine (CBZ), from wastewaters. These persistent pollutants are usually found at μg·dm-3 and ng·dm-3 levels, respectively, concentrations too low for proper detection and quantification by conventional analytical equipment without a prior concentration step. Commonly applied preconcentration methods are costly and time-consuming, suffer from irregular recoveries, and make use of VOCs. In this work, the ABS composed of the IL tetrabutylammonium chloride ([N4444]Cl) and the salt potassium citrate (K3[C6H5O7]) was investigated and shown to be able to completely extract and concentrate CAF and CBZ in a single step, thereby overcoming the detection limit of the analytical equipment applied. Finally, the hydrotropic effect responsible for the ability of IL-based ABS to extract and concentrate a wide variety of compounds was also investigated. It was shown that the IL governs the hydrotropic mechanism in the solubility of CAF in aqueous solutions, with an increase in solubility of up to 4-fold.
Moreover, proper selection of the IL enables the design of systems with enhanced solubility of a given solute in the IL-rich phase, allowing better extraction and concentration. IL-based ABS are thus a promising and versatile technique, straightforwardly envisaged as selective extraction and concentration routes for target micropollutants from wastewater matrices.
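The two figures of merit behind "extract and concentrate in a single step" can be sketched as a simple mass balance. The numbers below are illustrative only (not data from the thesis): a solute almost fully partitioned into an IL-rich phase of smaller volume is simultaneously recovered and enriched.

```python
def extraction_efficiency(m_top, m_total):
    """Percentage of solute recovered in the IL-rich (top) phase."""
    return 100.0 * m_top / m_total

def concentration_factor(c_top, c_initial):
    """How many times the solute is concentrated relative to the feed."""
    return c_top / c_initial

# e.g. 0.98 ug of the 1.0 ug of CAF fed ends up in a phase 1/10 the feed volume
ee = extraction_efficiency(0.98, 1.0)
cf = concentration_factor(9.8, 1.0)
print(round(ee, 1), round(cf, 1))   # 98.0 9.8
```

A concentration factor above the ratio of a detector's limit of quantification to the environmental level is what lets a trace pollutant cross the instrument's detection threshold.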
Abstract:
Nanotechnology has revolutionised humanity's capability to build microscopic systems by manipulating materials on a molecular and atomic scale. Nanosystems are becoming increasingly smaller and chemically more complex, which increases the demand for microscopic characterisation techniques. Among others, transmission electron microscopy (TEM) is an indispensable tool that is increasingly used to study the structures of nanosystems down to the molecular and atomic scale. However, despite its effectiveness, TEM can only provide two-dimensional projection (shadow) images of the 3D structure, leaving the three-dimensional information hidden, which can lead to incomplete or erroneous characterisation. One very promising inspection method is electron tomography (ET), which is rapidly becoming an important tool for exploring the 3D nano-world. ET provides (sub-)nanometre resolution in all three dimensions of the sample under investigation. However, the fidelity of the tomogram achieved by current ET reconstruction procedures remains a major challenge. This thesis addresses the assessment and advancement of electron tomographic methods to enable high-fidelity three-dimensional investigations. A quality assessment investigation was conducted to provide a quantitative analysis of the main established ET reconstruction algorithms and to study the influence of the experimental conditions on the quality of the reconstructed tomogram. Regularly shaped nanoparticles were used as a ground truth for this study. It is concluded that the fidelity of post-reconstruction quantitative analysis and segmentation is limited mainly by the fidelity of the reconstructed tomogram. This motivated the development of an improved tomographic reconstruction process. In this thesis, a novel ET method named dictionary learning electron tomography (DLET) is proposed.
DLET is based on the recent mathematical theory of compressed sensing (CS), which exploits the sparsity of ET tomograms to enable accurate reconstruction from undersampled (S)TEM tilt series. DLET learns the sparsifying transform (dictionary) adaptively and reconstructs the tomogram simultaneously from highly undersampled tilt series. In this method, sparsity is enforced on overlapping image patches, favouring local structures. Furthermore, the dictionary is adapted to the specific tomogram instance, favouring better sparsity and consequently higher-quality reconstructions. The reconstruction algorithm alternates between learning the sparsifying dictionary and employing it to remove artifacts and noise in one step, and restoring the tomogram data in the other step. Simulated and real ET experiments on several morphologies were performed with a variety of setups. The reconstruction results validate the method's efficiency in both noiseless and noisy cases and show that it yields improved reconstruction quality with fast convergence. The proposed method enables the recovery of high-fidelity information without the need to worry about which sparsifying transform to select or whether the images strictly satisfy the pre-conditions of a certain transform (e.g. being strictly piecewise constant for Total Variation minimisation). It can also avoid artifacts that specific sparsifying transforms may introduce (e.g. the staircase artifacts that may result from Total Variation minimisation). Moreover, this thesis shows how reliable, elementally sensitive tomography is possible with the aid of both appropriate use of dual electron energy loss spectroscopy (DualEELS) and the DLET compressed sensing algorithm, making the best use of the limited data volume and signal-to-noise ratio inherent in core-loss electron energy loss spectroscopy (EELS) from nanoparticles of an industrially important material.
Taken together, the results presented in this thesis demonstrate how high-fidelity ET reconstructions can be achieved using a compressed sensing approach.
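For context on the "established ET reconstruction algorithms" the quality study benchmarks, the sketch below implements SIRT (simultaneous iterative reconstruction technique) on a toy linear problem. The projection matrix here is a random stand-in, not a real tilt geometry, and this is not the DLET algorithm itself, only one of the classical baselines it is compared against.

```python
import numpy as np

rng = np.random.default_rng(0)
x_true = rng.random(16)                 # toy 4x4 "tomogram", flattened
A = rng.random((40, 16))                # toy stand-in for the projection geometry
b = A @ x_true                          # noiseless "tilt-series" measurements

# SIRT iteration: x <- x + C * A^T * R * (b - A x)
R = 1.0 / A.sum(axis=1)                 # inverse row sums (per-ray normalisation)
C = 1.0 / A.sum(axis=0)                 # inverse column sums (per-voxel normalisation)
x = np.zeros(16)
for _ in range(50_000):
    x += C * (A.T @ (R * (b - A @ x)))

rel_err = np.linalg.norm(x - x_true) / np.linalg.norm(x_true)
print("relative reconstruction error:", rel_err)
```

With consistent, noiseless data SIRT converges to the true solution; with few tilt angles and noise it plateaus at a biased reconstruction, which is precisely the fidelity limitation that motivates sparsity-based methods such as DLET.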
Volatility analysis, price integration, and predictability for the Brazilian shrimp market
Abstract:
The present paper investigates the dynamics of the volatility structure of shrimp prices in the Brazilian fish market. First, the initial aspects of the shrimp price series were described. From this information, statistical tests were performed and univariate models were selected as price predictors. Then, the existence of a long-term equilibrium relationship between Brazilian and American imported shrimp was tested and, where such a relationship was confirmed, whether there is a causal link between these assets, given the trade relations the two countries have maintained over the years. This is exploratory, applied research with a quantitative approach. The data were collected through direct contact with the Companhia de Entrepostos e Armazéns Gerais de São Paulo (CEAGESP) and from the official American import website of the National Marine Fisheries Service - National Oceanic and Atmospheric Administration (NMFS-NOAA). The results showed that the high variability in the asset price is directly related to the gains and losses of market agents. The price series shows strong seasonal and biannual effects. The average price of shrimp over the last 12 years was R$ 11.58, and external factors beyond production and marketing (U.S. antidumping measures, floods, and pathologies) strongly affected prices. Among the models tested for predicting shrimp prices, four were selected that, under one-step-ahead prediction over a 12-period horizon, proved statistically more robust. Weak evidence was found of a long-term equilibrium between Brazilian and American shrimp and, correspondingly, no causal link was found between them. We conclude that the price dynamics of the shrimp commodity are strongly influenced by external production factors and that these phenomena cause seasonal effects in prices.
There is no long-term stable relationship between Brazilian and American shrimp prices, although Brazil imports production inputs from the USA, which indicates some productive dependence. For market agents, the risk of interference from external prices cointegrated with Brazilian prices is practically nonexistent. Through statistical modeling it is possible to minimize the risk and uncertainty embedded in the fish market, so that sales and marketing strategies for Brazilian shrimp can be consolidated and disseminated.
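The "one-step-ahead" forecasting idea above can be sketched with the simplest univariate predictor, an AR(1) model fitted by ordinary least squares. This is a stand-in for the models the study actually selected, and the price values below are invented, not the CEAGESP series.

```python
import numpy as np

# Invented monthly price series (R$/kg), for illustration only
prices = np.array([10.2, 10.8, 11.4, 12.0, 11.6, 11.1, 11.9, 12.4, 11.6, 11.3])

# Fit y_t = c + phi * y_{t-1} by regressing each price on its predecessor
phi, c = np.polyfit(prices[:-1], prices[1:], 1)

# One-step-ahead forecast: plug the last observed price into the fitted model
one_step = c + phi * prices[-1]
print(round(one_step, 2))
```

Rolling this forecast forward, re-estimating as each new observation arrives, gives the horizon-12 one-step-ahead evaluation used to compare candidate models.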
Abstract:
This thesis is a research study, intended for later publication, in the form of a practical guide for those who have an idea and plan to go one step further towards creating a brand and/or a business. The main questions addressed concern the principal worries of an entrepreneur, identifying the main topics that an entrepreneur's practical guide must cover. This work aims to provide relevant and important insights for those who want to start a business, taking into consideration best practices and advice from entrepreneurs who have already started their own businesses and shared their experience. It is meant to be a strong contribution to the Portuguese ecosystem, more specifically to startups, small companies, projects or ideas at seed or startup stage, brands, clubs, and every initiative that is starting from nothing, or almost nothing. Apart from books and online research, primary information and testimonials were collected through an online survey of a target audience of entrepreneurs, leading to the main findings of this study. The conclusion of this thesis is the gross index of the future startuper's practical guide, which will be divided into four stages: Preparation, Implementation, Leverage and Closure.
Abstract:
Review
Abstract:
With its powerful search engines and billions of published pages, the World Wide Web has become the ultimate tool to explore the human experience. But despite the advent of the digital revolution, e-books, at their core, have remained remarkably similar to their printed siblings. This has resulted in a clear dichotomy between two ways of reading: on one side, the multi-dimensional world of the Web; on the other, the linearity of books and e-books. My investigation of the literature indicates that the focus of attempts to merge these two modes of production, and hence of reading, has been the insertion of interactivity into fiction. As I will show in the Literature Review, a clear thrust of research since the early 1990s, and in my opinion the most significant, has concentrated on presenting the reader with choices that affect the plot. This has resulted in interactive stories in which the structure of the narrative can be altered by the reader of experimental fiction. The interest in this area of research is not surprising, as the interaction of readers with the fabric of the narrative provides fertile ground for exploring, analysing, and discussing issues of plot consistency and continuity. I found in the literature several papers concerned with the effects of hyperlinking on literature, but none about how hyperlinked material and narrative could be integrated without compromising the narrative flow as designed by the author. This led me to think that the researchers had accepted hypertextuality and the linear organisation of fiction as being antithetical, thereby ignoring the possibility of exploiting the first while preserving the second. All the works I consulted were focussed on exploring the possibilities provided to authors (and readers) by hypertext or on how hypertext literature affects literary criticism. This was true in earlier works by Landow and Harpold and remained true in later works by Bolter and Grusin.
To quote another example, in his book Hypertext 3.0, Landow states: “Most who have speculated on the relation between hypertextuality and fiction concentrate [...] on the effects it will have on linear narrative”, and “hypertext opens major questions about story and plot by apparently doing away with linear organization” (Landow, 2006, pp. 220, 221). In other words, the authors have added narrative elements to Web pages, effectively placing their stories in a subordinate role. By focussing on “opening up” the plots, the researchers have missed the opportunity to maintain the integrity of their stories and use hyperlinked information to provide interactive access to backstory and factual bases. This would represent a missing link between the traditional way of reading, in which the readers have no influence on the path the author has laid out for them, and interactive narrative, in which the readers choose their way across alternatives, thereby, at least to a certain extent, creating their own path. It would be, to continue the metaphor, as if the readers could follow the main path created by the author while being able to get “sidetracked” into exploring hyperlinked material. In Hypertext 3.0, Landow refers to an “Axial structure [of hypertext] characteristic of electronic books and scholarly books with foot- and endnotes” versus a “Network structure of hypertext” (Landow, 2006, p. 70). My research aims at generalising the axial structure and extending it to fiction without losing the linearity at its core. In creative nonfiction, the introduction of places, scenes, and settings, together with characterisation, brings to life the facts without altering them; while much fiction draws on facts to provide a foundation, or narrative elements, for the work. But how can the reader distinguish between facts and representations? For example, to what extent do dialogues and perceptions present what was actually said and thought?
Some authors of creative nonfiction use endnotes to provide comments and citations while minimising disruption to the flow of the main text, but endnotes are limited in scope and constrained in space. Each reader should be able to enjoy the narrative as if it were a novel but also to explore the facts at the level of detail s/he needs. For this to be possible, endnotes should provide a Web-like way of exploring in more detail what the author has already researched. My research aims to develop ways of integrating narrative prose and hyperlinked documents into a Hyperbook. Its goal is to create a new writing paradigm in which a story incorporates a gateway to detailed information. While creative nonfiction uses the techniques of fictional writing to provide reportage of actual events, and fact-based fiction illuminates the affectual dimensions of what happened (e.g., Kate Grenville’s The Secret River and Hilary Mantel’s Wolf Hall), Hyperbooks go one step further and link narrative prose to the details of the events on which the narrative is based or, more generally, to information the reader might find of interest. My dissertation introduces and utilises Hyperbooks to engage in two parallel types of investigation: building knowledge about Italian WWII POWs held in Australia and presenting it as part of a novella in Hyperbook format, and developing a new piece of technology capable of extending the writing and reading process.
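The axial structure being generalised here can be pictured as a simple data structure: a linear main path of narrative segments, each of which may branch into hyperlinked background documents without disturbing the reading order. The class names below are mine, a toy sketch rather than the thesis's actual implementation.

```python
class Segment:
    """One unit of the linear narrative, with optional endnote-style side links."""
    def __init__(self, text, side_links=None):
        self.text = text
        self.side_links = side_links or []   # background documents, not plot branches

# The main path is an ordered list: linear reading is preserved,
# while each segment can open onto researched source material.
book = [
    Segment("Chapter opening...", side_links=["POW camp records, 1943"]),
    Segment("The narrative continues..."),
]
main_path = [s.text for s in book]
print(len(main_path), len(book[0].side_links))
```

The key property, unlike choice-based interactive fiction, is that following (or ignoring) every side link leaves the sequence of `main_path` unchanged.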
Abstract:
Aim: To evaluate the effects of 10% NaOCl gel application on dentin bond strengths and on the morphology of resin-dentin interfaces formed by three adhesives. Methods: Two etch-and-rinse adhesives (One-Step Plus, Bisco Inc. and Clearfil Photo Bond, Kuraray Noritake Dental) and one self-etch adhesive (Clearfil SE Bond, Kuraray Noritake Dental) were applied to dentin according to the manufacturers’ instructions or after treatment with 10% NaOCl (ED-Gel, Kuraray Noritake Dental) for 60 s. For interfacial analysis, specimens were subjected to an acid-base challenge and observed by SEM to identify the formation of the acid-base resistant zone (ABRZ). For microtensile bond strength, the same groups were investigated and the restored teeth were thermocycled (5,000 cycles) or not before testing. Bond strength data were subjected to two-way ANOVA and Tukey’s test (p<0.05). Results: NaOCl application affected the bond strengths of One-Step Plus and Clearfil Photo Bond. Thermocycling reduced the bond strengths of Clearfil Photo Bond and Clearfil SE Bond when used after NaOCl application, and of One-Step Plus when used as recommended by the manufacturer. ABRZ formation was observed adjacent to the hybrid layer for the self-etch system. The etch-and-rinse systems showed external lesions after the acid-base challenge and no ABRZ formation when applied according to the manufacturers’ instructions. Conclusions: 10% NaOCl changed the morphology of the bonding interfaces, and its use with etch-and-rinse adhesives reduced the dentin bond strength. ABRZ formation was material-dependent and the interface morphologies differed among the tested materials.
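The two-way ANOVA used on the bond strength data (factors: adhesive and NaOCl treatment) can be computed from scratch for a balanced design. The sketch below uses synthetic numbers with an artificial adhesive effect, purely to show the sums-of-squares decomposition, not the study's measurements.

```python
import numpy as np

def two_way_anova(y):
    """Balanced two-way ANOVA. y has shape (a, b, n): factor A levels,
    factor B levels, replicates. Returns F statistics (A, B, interaction)."""
    a, b, n = y.shape
    grand = y.mean()
    mA = y.mean(axis=(1, 2))                  # factor A level means
    mB = y.mean(axis=(0, 2))                  # factor B level means
    mAB = y.mean(axis=2)                      # cell means
    ssA = b * n * ((mA - grand) ** 2).sum()
    ssB = a * n * ((mB - grand) ** 2).sum()
    ssAB = n * ((mAB - mA[:, None] - mB[None, :] + grand) ** 2).sum()
    ssE = ((y - mAB[:, :, None]) ** 2).sum()  # within-cell (error) SS
    msE = ssE / (a * b * (n - 1))
    return (ssA / (a - 1) / msE,
            ssB / (b - 1) / msE,
            ssAB / ((a - 1) * (b - 1)) / msE)

rng = np.random.default_rng(1)
y = 30 + rng.normal(0, 2, (3, 2, 5))          # 3 adhesives x 2 treatments x 5 specimens (MPa)
y[0] += 8                                     # pretend adhesive 1 bonds more strongly
fA, fB, fAB = two_way_anova(y)
print(round(fA, 1), round(fB, 1))
```

Each F statistic is then compared against the F distribution with the matching degrees of freedom; a Tukey test would follow for pairwise level comparisons.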
Abstract:
Doctorate in Economics
Abstract:
Aim: To evaluate the dislocation resistance of the quartz fiber post/cement/dentin interface after different adhesion strategies. Methods: Forty bovine lower central incisors were selected and prepared with K-files using the step-back technique, with irrigation with 3 mL of distilled water before the use of each instrument. Prepared teeth were stored at 37 °C and 100% humidity for 7 days. The roots were prepared and randomized into 4 groups, and the quartz fiber post was cemented with an adhesion strategy according to the following groups: GBisCem: BisCem; GOneStep+C&B: One Step + C&B; GAllBond+C&B: All-Bond 3 + C&B; GAllBondSE+C&B: All-Bond SE + C&B. Cross-sectional root slices of 0.7 mm were produced and stored for 24 h at 37 °C before being submitted to push-out bond strength testing. Results: The mean and standard deviation values of dislocation resistance were GBisCem: 1.12 (± 0.23) MPa; GOneStep+C&B: 0.81 (± 0.31) MPa; GAllBond+C&B: 0.98 (± 0.14) MPa; and GAllBondSE+C&B: 1.57 (± 0.04) MPa. GAllBondSE+C&B showed significantly higher dislocation resistance than the other groups. Conclusions: Within this study design, it may be concluded that the adhesion strategies yielded different quartz post dislocation resistance. The simplified adhesive system incorporating sodium benzene sulphinate provided superior dislocation resistance.
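Push-out bond strength is the peak dislocation load divided by the bonded (lateral) surface of the root slice; for a slightly tapered canal this is the lateral area of a conical frustum. The radii and load below are illustrative values I chose, not the study's measurements; only the 0.7 mm slice thickness comes from the abstract.

```python
import math

def push_out_strength(load_N, r_coronal_mm, r_apical_mm, h_mm=0.7):
    """Push-out bond strength (MPa) of a tapered root slice."""
    slant = math.sqrt((r_coronal_mm - r_apical_mm) ** 2 + h_mm ** 2)
    area = math.pi * (r_coronal_mm + r_apical_mm) * slant   # lateral area, mm^2
    return load_N / area                                    # N/mm^2 == MPa

v = push_out_strength(5.0, 0.80, 0.75)
print(round(v, 2))   # ~1.46 MPa, in the range the study reports
```

For a non-tapered (cylindrical) slice the formula reduces to load / (2*pi*r*h).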
Abstract:
In the Portuguese Navy, the use of Unmanned Underwater Vehicles (UUV) is very limited, restricted solely to mine detection. However, with technological and scientific progress, their use may be one step away from other applications whose applicability has not yet been explored. Along these lines arose the ICARUS project (Integrated Components for Assisted Rescue and Unmanned Search operations), which aims to develop unmanned vehicles for search and rescue. Its objective is the rescue of castaways using UUVs, thereby promoting efficient management of resources, an objective contemplated in the Navy's planning directive. Thus, building on the project developed in last year's theses by ASPOF Maia da Fonseca and Ramos da Palma, the present dissertation intends, by means of a sonar system installed on a UUV in upward-looking mode, to assess the feasibility of detecting a castaway adrift at sea from its readings. To this end, simulations are carried out with the castaway in different positions and in environments closer to the reality of the sea, together with the optimization of the characteristics that allow the castaway to be identified.
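The upward-looking geometry of such a detection can be sketched with two relations: the radius of the sea-surface footprint ensonified by the sonar beam, and the slant range from the vehicle to a target at the edge of that footprint. The depth and beam width below are my assumptions for illustration, not ICARUS project parameters.

```python
import math

def surface_footprint_radius(depth_m, half_beam_deg):
    """Radius of the surface patch covered by an upward-looking conical beam."""
    return depth_m * math.tan(math.radians(half_beam_deg))

def slant_range(depth_m, horizontal_offset_m):
    """Straight-line distance from the UUV to a surface target."""
    return math.hypot(depth_m, horizontal_offset_m)

r = surface_footprint_radius(20.0, 30.0)   # UUV at 20 m, 30 deg half-beam (assumed)
s = slant_range(20.0, r)
print(round(r, 1), round(s, 1))
```

A castaway drifting within that footprint returns an echo at the corresponding slant range; simulating different target positions amounts to sweeping the horizontal offset.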
Abstract:
This degree project, entitled "Teacher's Guide Text for the Discrete Mathematics Block of the First Year of B.G.U.", was developed to make a significant contribution and to assist first-year Bachillerato mathematics teachers, aiming at better performance in the classroom. The document is based on current Ecuadorian educational legislation and on the official documents of the Ministry of Education; the proposed topic corresponds to the third curricular block of the first year of the Unified General Bachillerato (B.G.U.) in the subject of Mathematics. The work consists of three chapters. Chapter one presents a synthesis of topics such as the evolution of Ecuadorian education, pedagogical models, teaching methods, didactics of mathematics, and linear programming, considered the basis for developing the proposal. Chapter two details the statistical investigation carried out through a survey of first-year Bachillerato mathematics teachers belonging to Zonal Education Coordination 6, North District. The findings supported the proposal to implement the guide text for learning Discrete Mathematics. Chapter three develops the proposed guide text, structured in six teaching guides, each corresponding to the development of a skill with performance criteria for the topic at hand. The chapter closes with conclusions and recommendations addressed to the mathematics teacher.
Abstract:
The diversity in the way cloud providers offer their services, formulate their SLAs, present their QoS, or support different technologies makes the portability and interoperability of cloud applications very difficult and favours the well-known vendor lock-in problem. We propose a model to describe cloud applications and the required resources in an agnostic, provider- and resource-independent way, in which individual application modules, and entire applications, may be re-deployed using different services without modification. To support this model, and following the proposal of a variety of cross-cloud application management tools by different authors, we propose going one step further in the unification of cloud services with a management approach in which IaaS and PaaS services are integrated into a unified interface. We provide support for deploying applications whose components are distributed across different cloud providers, using IaaS and PaaS services interchangeably.
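The idea of a provider-agnostic module description resolved against a unified IaaS/PaaS interface can be sketched as follows. The class and field names are invented for illustration, not the actual model proposed in the work.

```python
from dataclasses import dataclass, field

@dataclass
class ModuleSpec:
    """A cloud application module described independently of any provider."""
    name: str
    runtime: str                       # e.g. "python3" -- abstract, not provider-specific
    min_memory_mb: int
    service_kind: str                  # "iaas" or "paas", resolved only at deploy time
    depends_on: list = field(default_factory=list)

def place(module, providers):
    """Map a module onto any provider offering the required kind of service."""
    return next(p for p in providers if module.service_kind in p["services"])

web = ModuleSpec("web", "python3", 512, "paas", depends_on=["db"])
providers = [{"name": "cloudA", "services": {"iaas"}},
             {"name": "cloudB", "services": {"iaas", "paas"}}]
print(place(web, providers)["name"])   # cloudB
```

Because `ModuleSpec` names no concrete service, re-deploying the same module on another provider is just a different `place` result, which is the portability property the model targets.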
Abstract:
Protein purification plays a crucial role in biotechnology and biomanufacturing, where downstream unit operations account for 40-80% of the overall costs. To reduce these costs, companies strive to simplify the separation process by reducing the number of steps and replacing expensive separation devices. In this context, commercially available polybutylene terephthalate (PBT) melt-blown nonwoven membranes have been developed as a novel disposable membrane chromatography support. The PBT nonwoven membrane is able to capture products and remove contaminants by ion exchange chromatography. The PBT nonwoven membrane was modified by grafting a poly(glycidyl methacrylate) (polyGMA) layer by either photo-induced or heat-induced graft polymerization (HIG). The epoxy groups of the GMA monomer were subsequently converted into cation and anion exchangers by reaction with sulfonic acid groups or diethylamine (DEA), respectively. Several parameters of the procedure were studied, especially the effects of (i) % weight gain and (ii) ligand density on the static protein binding capacity. Bovine serum albumin (BSA) and human immunoglobulin G (hIgG) were used as model proteins in the anion and cation exchange studies. The performance of the HIG ion exchange PBT nonwovens was evaluated under flow conditions, and the anion- and cation-exchange HIG PBT nonwovens were evaluated for their ability to selectively adsorb and elute BSA or hIgG from a mixture of proteins. The cation exchange nonwovens did not achieve a good protein separation, whereas the anion exchange HIG nonwovens were able to adsorb and elute BSA with very high purity and yield in only one purification step.
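The static protein binding capacity studied above is obtained from a batch depletion measurement: protein removed from solution per gram of membrane. The concentrations, volume, and membrane mass below are illustrative numbers, not the thesis data.

```python
def static_binding_capacity(c0_mg_ml, cf_mg_ml, volume_ml, membrane_g):
    """Protein bound (mg) per gram of functionalised nonwoven,
    from initial and final solution concentrations in a batch experiment."""
    return (c0_mg_ml - cf_mg_ml) * volume_ml / membrane_g

# e.g. 10 mL of 2.0 mg/mL BSA depleted to 0.4 mg/mL by 50 mg of membrane
q = static_binding_capacity(2.0, 0.4, 10.0, 0.05)
print(round(q, 1))   # 320.0 mg/g
```

Plotting this capacity against % weight gain or ligand density is how the effect of each grafting parameter would be assessed.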
Abstract:
We report the simplification and development of a biofunctionalization methodology based on a one-step 1-ethyl-3-(3-dimethylaminopropyl)carbodiimide (EDC)-mediated reaction. The dual-peak long period grating (dLPG) has demonstrated inherent ultrahigh sensitivity to refractive index (RI), achieving a 50-fold improvement in RI sensitivity over a standard LPG sensor used in the low-RI range. With the simple and efficient immobilization of unmodified oligonucleotides on the sensor surface, the dLPG-based biosensor was used to monitor the hybridization of complementary oligonucleotides, showing a detectable oligonucleotide concentration of 4 nM with the advantages of label-free, real-time operation and ultrahigh sensitivity.
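The RI sensitivity comparison behind the 50-fold claim is simply the wavelength shift per refractive-index unit (RIU). The shift and index-change values below are assumed for illustration; only the 50x ratio echoes the abstract.

```python
def ri_sensitivity(d_lambda_nm, d_n):
    """Refractive-index sensitivity: resonance shift (nm) per RIU."""
    return d_lambda_nm / d_n

dlpg = ri_sensitivity(25.0, 0.005)   # assumed dual-peak LPG response: 5000 nm/RIU
lpg = ri_sensitivity(0.5, 0.005)     # assumed standard LPG response: 100 nm/RIU
print(round(dlpg / lpg))             # 50
```

In the dual-peak regime both resonance peaks shift in opposite directions, so tracking their separation roughly doubles the usable signal on top of the intrinsically steeper dispersion near the turning point.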