946 results for solution-based DNA extraction
Abstract:
Novel ionic liquids containing ampicillin as an active pharmaceutical ingredient anion were prepared with good yields by using a new, efficient synthetic procedure based on the neutralization of a moderately basic ammonia solution of ampicillin with different organic cation hydroxides. The relevant physical and thermal properties of these novel ionic liquids based on ampicillin were also evaluated.
Abstract:
This paper presents a new and efficient methodology for distribution network reconfiguration integrated with optimal power flow (OPF), based on a Benders decomposition approach. The objective is to minimize power losses while balancing load among feeders, subject to constraints on branch capacity limits, minimum and maximum power limits of substations or distributed generators, deviation of bus voltages, and radial operation of the network. The Generalized Benders decomposition algorithm is applied to solve the problem. The formulation is decomposed into two stages. The first, the Master problem, is formulated as a mixed-integer non-linear programming problem and determines the radial topology of the distribution network. The second, the Slave problem, is formulated as a non-linear programming problem; it checks the feasibility of the Master problem solution by means of an OPF and provides the information needed to formulate the linear Benders cuts that connect both problems. The model is programmed in GAMS. The effectiveness of the proposal is demonstrated through two examples taken from the literature.
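The master/slave iteration with Benders cuts can be sketched on a toy problem. This is a minimal, generic illustration of the decomposition loop, not the paper's GAMS model: the "master" here just picks a binary variable y, and the "slave" is a one-variable LP solved in closed form; all cost data are made up.

```python
# Minimal Benders decomposition sketch on a toy problem (assumed data):
#   min  f*y + d*x   s.t.  x >= b - a*y,  x >= 0,  y in {0, 1}
f, d, a, b = 5.0, 2.0, 4.0, 6.0

def slave(y):
    """Solve the slave LP for fixed y; return its cost and dual multiplier."""
    x = max(0.0, b - a * y)
    lam = d if x > 0 else 0.0           # dual of the constraint x >= b - a*y
    return d * x, lam

cuts = []                               # Benders cuts: theta >= lam*(b - a*y)
best = None
for _ in range(10):                     # master/slave iteration
    # Master: enumerate binary y (trivial here), theta bounded below by cuts
    cand = []
    for y in (0, 1):
        theta = max([lam * (b - a * y) for lam in cuts] + [0.0])
        cand.append((f * y + theta, y))
    lower, y = min(cand)                # master optimum = lower bound
    cost, lam = slave(y)                # slave gives the true cost + a dual
    upper = f * y + cost                # feasible solution = upper bound
    best = (upper, y) if best is None or upper < best[0] else best
    if upper - lower < 1e-9:            # bounds coincide -> optimal
        break
    cuts.append(lam)                    # add the new Benders cut
```

On this instance the loop converges in two iterations to y = 1 with total cost 9.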
Abstract:
Proteins are biochemical entities consisting of one or more blocks typically folded in a 3D pattern. Each block (a polypeptide) is a single linear sequence of amino acids that are biochemically bonded together. The amino acid sequence in a protein is defined by the sequence of a gene or several genes encoded in the DNA-based genetic code. This genetic code typically uses twenty amino acids, but in certain organisms the genetic code can also include two other amino acids. After linking the amino acids during protein synthesis, each amino acid becomes a residue in a protein, which is then chemically modified, ultimately changing and defining the protein function. In this study, the authors analyze the amino acid sequence using alignment-free methods, aiming to identify structural patterns in sets of proteins and in the proteome, without any other previous assumptions. The paper starts by analyzing amino acid sequence data by means of histograms using fixed length amino acid words (tuples). After creating the initial relative frequency histograms, they are transformed and processed in order to generate quantitative results for information extraction and graphical visualization. Selected samples from two reference datasets are used, and results reveal that the proposed method is able to generate relevant outputs in accordance with current scientific knowledge in domains like protein sequence/proteome analysis.
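The first step described above, building relative frequency histograms of fixed-length amino-acid words (tuples), can be sketched as follows. The sequence used is a made-up example, not data from the paper:

```python
# Sketch of a fixed-length amino-acid word (tuple) histogram, the first
# step of the alignment-free analysis described in the abstract.
from collections import Counter

def tuple_histogram(seq, k):
    """Relative frequencies of overlapping length-k words in a sequence."""
    words = [seq[i:i + k] for i in range(len(seq) - k + 1)]
    counts = Counter(words)
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

hist = tuple_histogram("MKVLAAGMKV", 2)
# "MK" occurs twice among the nine overlapping 2-tuples -> frequency 2/9
```

These raw histograms would then be transformed and compared across protein sets, as the abstract describes.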
Abstract:
Involving groups in important management processes such as decision making has several advantages. By discussing and combining ideas, counter-ideas, critical opinions, identified constraints, and alternatives, a group of individuals can test potentially better solutions, sometimes in the form of new products, services, and plans. In the past few decades, operations research, AI, and computer science have had tremendous success creating software systems that can achieve optimal solutions, even for complex problems. The only drawback is that people don't always agree with these solutions. Sometimes this dissatisfaction is due to an incorrect parameterization of the problem. Nevertheless, the reasons people don't like a solution might not be quantifiable, because those reasons are often based on aspects such as emotion, mood, and personality. At the same time, monolithic individual decision-support systems centered on optimizing solutions are being replaced by collaborative systems and group decision-support systems (GDSSs) that focus more on establishing connections between people in organizations. These systems follow a kind of social paradigm. Combining both optimization- and social-centered approaches is a topic of current research. However, even if such a hybrid approach can be developed, it will still miss an essential point: the emotional nature of group participants in decision-making tasks. We've developed a context-aware, emotion-based model to design intelligent agents for group decision-making processes. To evaluate this model, we've incorporated it in an agent-based simulator called ABS4GD (Agent-Based Simulation for Group Decision), which we developed. This multiagent simulator considers emotion- and argument-based factors while supporting group decision-making processes. Experiments show that agents endowed with emotional awareness achieve agreements more quickly than those without such awareness.
Hence, participant agents that integrate emotional factors in their judgments can be more successful because, in exchanging arguments with other agents, they consider the emotional nature of group decision making.
Abstract:
In this paper we present results on the use of a semiconductor heterostructure based on a-SiC:H as a wavelength-division demultiplexer for the visible light spectrum. The proposed device is composed of two stacked p-i-n photodiodes with intrinsic absorber regions tuned to short- and long-wavelength absorption and carrier collection. An optoelectronic characterisation of the device was performed in the visible spectrum. The device's functionality for WDM applications was demonstrated with three different input channels covering the long, medium and short wavelengths of the visible range. The recovery of the input channels is explained using the dependence of the photocurrent spectrum on the applied voltage. An electrical model of the WDM device is proposed and supported by the solution of the respective circuit equations. Short-range optical communications constitute the major application field; however, other applications are foreseen.
Abstract:
This paper presents the study of the remediation of sandy soils containing six of the most common contaminants (benzene, toluene, ethylbenzene, xylene, trichloroethylene and perchloroethylene) using soil vapour extraction (SVE). The influence of soil water content on the process efficiency was evaluated considering the soil type and the contaminant. For artificially contaminated soils with negligible clay contents and natural organic matter it was concluded that: (i) all the remediation processes presented efficiencies above 92%; (ii) an increase of the soil water content led to a more time-consuming remediation; (iii) longer remediation periods were observed for contaminants with lower vapour pressures and lower water solubilities due to mass transfer limitations. Based on these results an easy and relatively fast procedure was developed for the prediction of the remediation times of real soils; 83% of the remediation times were predicted with relative deviations below 14%.
Abstract:
Liver steatosis is a common disease usually associated with social and genetic factors. Early detection and quantification are important, since the disease can evolve to cirrhosis. In this paper, a new computer-aided diagnosis (CAD) system for steatosis classification, on a local and global basis, is presented. A Bayes factor is computed from objective ultrasound textural features extracted from the liver parenchyma. The goal is to develop a CAD screening tool to help in steatosis detection. Results showed an accuracy of 93.33%, with a sensitivity of 94.59% and a specificity of 92.11%, using the Bayes classifier. The proposed CAD system also provides a suitable graphical display for steatosis classification.
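The three figures reported above are standard binary-classification metrics derived from a confusion matrix. The counts below are hypothetical, chosen only to reproduce numbers close to those in the abstract:

```python
# Sketch of how accuracy, sensitivity and specificity relate to confusion
# counts. The counts (tp, fn, tn, fp) are hypothetical illustrations, not
# the study's actual data.
def metrics(tp, fn, tn, fp):
    """Standard binary-classification metrics from confusion counts."""
    accuracy = (tp + tn) / (tp + fn + tn + fp)
    sensitivity = tp / (tp + fn)          # true-positive rate
    specificity = tn / (tn + fp)          # true-negative rate
    return accuracy, sensitivity, specificity

acc, sens, spec = metrics(tp=35, fn=2, tn=35, fp=3)
# 37 positives and 38 negatives -> 93.33% accuracy, 94.59% sensitivity,
# 92.11% specificity, matching the figures quoted in the abstract
```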
Abstract:
PURPOSE: Fatty liver disease (FLD) is an increasingly prevalent disease that can be reversed if detected early. Ultrasound is the safest and most ubiquitous method for identifying FLD. Since expert sonographers are required to accurately interpret liver ultrasound images, a shortage of such experts results in interobserver variability. For more objective interpretation, high accuracy, and quick second opinions, computer-aided diagnostic (CAD) techniques may be exploited. The purpose of this work is to develop one such CAD technique for the accurate classification of normal livers and abnormal livers affected by FLD. METHODS: In this paper, the authors present a CAD technique (called Symtosis) that uses a novel combination of significant features based on the texture, wavelet transform, and higher-order spectra of the liver ultrasound images in various supervised learning-based classifiers in order to determine parameters that classify normal and FLD-affected abnormal livers. RESULTS: On evaluating the proposed technique on a database of 58 abnormal and 42 normal liver ultrasound images, the authors were able to achieve a high classification accuracy of 93.3% using the decision tree classifier. CONCLUSIONS: This high accuracy, added to the completely automated classification procedure, makes the authors' proposed technique highly suitable for clinical deployment and usage.
Abstract:
In this work the identification and diagnosis of various stages of chronic liver disease is addressed. The classification results of a support vector machine, a decision tree and a k-nearest neighbor classifier are compared. Ultrasound image intensity and textural features are used jointly with clinical and laboratory data in the staging process. The classifiers' training is performed using a population of 97 patients at six different stages of chronic liver disease and a leave-one-out cross-validation strategy. The best results are obtained using the support vector machine with a radial-basis kernel, with an overall accuracy of 73.20%. The good performance of the method is a promising indicator that it can be used, in a non-invasive way, to provide reliable information about chronic liver disease staging.
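The leave-one-out strategy used above can be sketched generically: each of the 97 patients is held out in turn, the classifier is trained on the rest, and the held-out patient is scored. The 1-nearest-neighbour classifier and toy data below are stand-ins (the paper's best classifier was an RBF-kernel SVM):

```python
# Leave-one-out cross-validation skeleton with a stand-in 1-NN classifier
# and made-up two-feature samples (labels are hypothetical stage names).
def nn_predict(train, x):
    """Predict the label of x from the closest training sample."""
    return min(train, key=lambda s: sum((a - b) ** 2
                                        for a, b in zip(s[0], x)))[1]

def leave_one_out_accuracy(data):
    """Train on all samples but one, test on the held-out sample."""
    hits = 0
    for i, (x, y) in enumerate(data):
        train = data[:i] + data[i + 1:]
        hits += nn_predict(train, x) == y
    return hits / len(data)

data = [((0.0, 0.1), "F1"), ((0.1, 0.0), "F1"),
        ((1.0, 0.9), "F4"), ((0.9, 1.0), "F4")]
acc = leave_one_out_accuracy(data)  # 1.0 on this separable toy set
```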
Abstract:
In this work, we present a neural network (NN) based method designed for 3D rigid-body registration of FMRI time series, which relies on a limited number of Fourier coefficients of the images to be aligned. These coefficients, contained in a small cubic neighborhood located at the first octant of the 3D Fourier space (including the DC component), are fed into six NNs during the learning stage. Each NN yields the estimate of one registration parameter. The proposed method was assessed for 3D rigid-body transformations, using DC neighborhoods of different sizes. The mean absolute registration errors are of approximately 0.030 mm in translations and 0.030 deg in rotations, for the typical motion amplitudes encountered in FMRI studies. The construction of the training set and the learning stage are fast, requiring 90 s and 1 to 12 s respectively, depending on the number of input and hidden units of the NN. We believe that NN-based approaches to the problem of FMRI registration can be of great interest in the future. For instance, NNs relying on limited k-space data (possibly from navigator echoes) can be a valid solution to the problem of prospective (in-frame) FMRI registration.
Abstract:
Master's in Electrical and Computer Engineering
Abstract:
Solution enthalpies of 1,4-dioxane have been obtained in 15 protic and aprotic solvents at 298.15 K. By breaking down the overall process using Solomonov's methodology, the cavity term was calculated and the interaction enthalpies (Delta H-int) were determined. The main factors involved in the interaction enthalpy were identified and quantified using a QSPR approach based on the TAKA model equation. The relevant descriptors were found to be pi* and beta, which showed exothermic and endothermic contributions, respectively. The magnitude of the pi* coefficient points toward non-specific solute-solvent interactions playing the major role in the solution process. The positive value of the beta coefficient reflects the endothermic character of the solvents' hydrogen-bond acceptor (HBA) basicity contribution, indicating that solvent molecules engaged in hydrogen bonding preferentially interact with each other rather than with 1,4-dioxane.
Abstract:
Nowadays, the prevention of metal residues is a very important issue for a large number of companies, which need to optimise their wastewater treatment systems in order to meet the legal limits on metal ion content and be allowed to discharge their wastewater into the public water domain. This problem has motivated innovative studies on the removal of metal ions from wastewater, which have shown that membrane technologies offer a number of advantages for this purpose. One of these technologies, known as the Supported Liquid Membrane (SLM), is based on an extraction mechanism. A hydrophobic membrane, impregnated with an extractant solution, acts as a barrier between the wastewater and a stripping solution, which is usually acidic. The pH difference between the wastewater and this solution acts as the driving force for the transport of metal ions from the wastewater into it. A stability problem may arise from leakage of the extractant solution out of the membrane pores. Previous studies have shown that alkylphosphoric or phosphonic acids, such as the reagents D2EHPA and CYANEX, and hydroxyoximes such as LIX 860-I can be very useful for the extraction of metal ions such as iron, copper, nickel, zinc and others. Classical liquid-liquid extraction has also shown that mixing different extractants can have a synergistic effect. However, it is not clear whether there is an optimal extractant ratio or what type of complex is formed during the extraction process. The aim of this project is to investigate this synergistic behaviour and the complexes formed by means of spectrophotometric methods, namely Job's method and the mole-ratio method. These methods are used to estimate the stoichiometry of the various complexes formed between two solutes, from the variation in the absorbance of the complexes compared with the absorbance of the solute.
In this project, Job's method and the mole-ratio method are applied to a three-component system to obtain more information about the complexation of nickel(II) and to determine the extractant:metal ratio of the complexes formed when mixtures of the extractants D2EHPA and LIX 860-I are applied. According to Job's method, the highest absorbance lies in the region of 0.015-0.040 M LIX 860-I with a low concentration of D2EHPA. When the different experiments of an experimental set were evaluated according to Job's method, the maximum of the plot was found at a lower mole fraction of the metal ion and a higher concentration of D2EHPA; this shift, from 0.50 down to 0.30, could point towards the formation of different complexes. For the mole-ratio method, the stoichiometry of the metal complexes can be determined from the intersection point of the tangent lines in the plot of absorbance versus ligand concentration. In all cases, the maximum was obtained around a total concentration of 0.010 M. When D2EHPA was applied alone, very low absorbances were obtained.
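The stoichiometry estimate behind Job's method can be sketched as follows: for a complex M + nL -> MLn at constant total concentration, the absorbance maximum occurs at a ligand mole fraction x_max = n/(n+1), so n = x_max/(1 - x_max). The numeric values below are illustrative, not the thesis data:

```python
# Sketch of the Job's-method stoichiometry estimate (continuous-variation
# method). Input values here are illustrative examples only.
def jobs_ratio(x_max):
    """Ligand:metal ratio n from the ligand mole fraction at max absorbance."""
    return x_max / (1.0 - x_max)

n_11 = jobs_ratio(0.50)   # maximum at x = 0.50 -> 1:1 complex
n_hi = jobs_ratio(0.70)   # a metal-fraction maximum shifted to 0.30
                          # (ligand fraction 0.70) -> ratio near 1:2.3,
                          # i.e. formation of higher complexes
```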
Oxidative leaching of metals from electronic waste with solutions based on quaternary ammonium salts
Abstract:
The treatment of waste electrical and electronic equipment (WEEE) is a problem that receives ever more attention. Inadequate treatment results in harmful products ending up in the environment. This project investigates the possibilities of an alternative route for recycling metals from printed circuit boards (PCBs) obtained from discarded computers. The process is based on aqueous solutions composed of an etchant, either 0.2 M CuCl2.2H2O or 0.2 M FeCl3.6H2O, and a quaternary ammonium salt (quat) such as choline chloride or chlormequat. These solutions are reminiscent of deep eutectic solvents (DES) based on quats. DES are quite similar to ionic liquids (ILs) and are likewise used as alternative solvents with a great diversity of physical properties, making them attractive replacements for hazardous, volatile solvents (e.g. VOCs). A remarkable difference between genuine DES or ILs and the solutions used in this project is the addition of rather large quantities of water. It is shown that the presence of water has considerable advantages for the leaching of metals, while the properties typical of DES still remain. The oxidizing capacity of Cu(II) stems from the existence of a stable Cu(I) species in quat-based DES, so the leaching is driven by the activity of the Cu(II)/Cu(I) redox couple. The advantage of Fe(III) in combination with DES is that the Fe(III)/Fe(II) redox couple becomes reversible, which is not the case in pure water. This opens perspectives for regeneration of the etching solution. In this project the leaching of copper was studied as a function of gradually increasing water content, from 0 to 100 wt%, with the same concentration of copper chloride or iron(III) chloride, at room temperature and at 80 °C. The solutions were also tested on real PCBs. At room temperature, the maximum leaching effect for copper was obtained with 30 wt% choline chloride and 0.2 M CuCl2.2H2O.
The leaching effect is even stronger at 80 °C, but of course these solutions are more energy-consuming. For aluminium, tin, zinc and lead, the leaching was faster at 80 °C. Iron and nickel dissolved easily at room temperature. The solutions were not able to dissolve gold, silver, rhodium or platinum.
Abstract:
The ECG signal has been shown to contain relevant information for human identification. Even though results validate the potential of these signals, the data acquisition methods and apparatus explored so far compromise user acceptability, requiring the acquisition of the ECG at the chest. In this paper, we propose a finger-based ECG biometric system that uses signals collected at the fingers through a minimally intrusive 1-lead ECG setup, using gel-free Ag/AgCl electrodes as the interface with the skin. The collected signal is significantly noisier than the ECG acquired at the chest, motivating the application of feature extraction and signal processing techniques to the problem. Time-domain ECG signal processing is performed, comprising the usual steps of filtering, peak detection, heartbeat waveform segmentation, and amplitude normalization, plus an additional step of time normalization. Through a simple minimum-distance criterion between the test patterns and the enrollment database, results have revealed this to be a promising technique for biometric applications.
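The minimum-distance criterion mentioned above can be sketched generically: each test heartbeat template is assigned to the enrolled subject whose stored template is closest in Euclidean distance. The templates and subject names below are made-up stand-ins for normalized heartbeat waveforms, not the paper's data:

```python
# Sketch of a minimum-distance identification step over enrolled heartbeat
# templates. Subject names and template vectors are hypothetical examples.
import math

def identify(enrollment, test_template):
    """Return the subject whose enrolled template is nearest to the test."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(enrollment, key=lambda subj: dist(enrollment[subj],
                                                 test_template))

enrollment = {"subject_a": [0.0, 0.8, 1.0, 0.3],
              "subject_b": [0.1, 0.4, 0.9, 0.6]}
who = identify(enrollment, [0.05, 0.75, 1.0, 0.35])  # -> "subject_a"
```

In practice each entry would hold one or more averaged, amplitude- and time-normalized heartbeat waveforms per enrolled person.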